Thursday, February 28, 2013
Rescuing Cesar
On a hazy, hundred-degree morning three summers ago, during the most difficult time in his life, Cesar Millan drove his silver John Deere Gator high up on a ridge that looks out over his Dog Psychology Center – 43 acres of scorched red-dirt hills and rocky ridges north of Los Angeles, with no indoor plumbing, no air-conditioning, and very little shade. He shut off the engine, wiped dust from his face, and sighed. "Tony Robbins has his island in Fiji," he said, with a smile that seemed hopeful but also a little sad. "I have this."
Millan paid $1.3 million for this land, which is just over the hill from Magic Mountain, and called it "my greatest investment, after dog food." He planned to turn the place into a sanctuary for abandoned dogs, as well as an academy where he would teach the unconventional training methods he introduced on nine seasons of his hugely successful TV series, 'Dog Whisperer'. "In reality," he said, "it's not about training dogs. It's about training the human to learn from dogs."
So far, not much progress had been made. The only permanent structures were a small office with a wooden desk and some plastic furniture, plus a few dog kennels and a murky above-ground pool. Millan had hoped to rescue 60 dogs that summer – "hardcore, aggressive dogs," he told me. "Dogs on death row." But, he admitted, "I'm not ready."
Earlier that year, in a few awful months at the start of 2010, Millan's life turned upside down. In February, his sidekick Daddy, a giant, gentle red pit bull who frequently assisted Millan on the show and whom he calls "my mentor," died of cancer at age 16. A month later, while he was on tour in Europe, his wife of 16 years, Ilusion, informed him she was filing for divorce. As he was reeling from those blows, Millan discovered that while 'Dog Whisperer' had made him one of America's biggest TV stars, a series of bad business deals had left him with very little in the bank to show for it. "I found out I didn't own anything – just T-shirts and touring," he told me recently. "It was the biggest shock in the world."
Millan remembers walking around in a daze, feeling betrayed and very alone. "I am a pack animal," he said. "Everything I did was to keep the pack together. All of a sudden I had no pack." He slept on his brother's couch, spent time in church, and lost so much weight he dropped four pants sizes. Occasionally, he returned home to visit his family in suburban Santa Clarita, a few miles from the ranch. "We were trying to do the whole thing white people do where they come back and visit," he says now, with a bitter laugh. "But it didn't work for me." Millan's two sons, Andre, then 15, and Calvin, 11, blamed him for the separation and refused to speak to him. "They were brainwashed. . . . They believed their life was better without me," he says. During the worst times, even his dogs kept their distance. "Dogs don't follow an unstable leader," he says. "I was very unstable."
That May, Millan hit bottom. "It was a spiral," he says. "All the willpower I had, the desire to motivate myself, my kids, all I had achieved - none of that, nothing, mattered."
One day, at his wife's house, he swallowed a bottle of her Xanax and some other pills and got into bed, hoping to end his life. "I thought, If I do a combination, I can die quicker. So I just took all the pills I could find, poof."
"I had so much rage and sadness," he continues. "I went to the other side of me, which is 'fuck it, I'm a failure.'" Millan woke up in the hospital psychiatric ward, where he remained under observation for 72 hours. "Nothing happened!" he says. "I thought, Well shit, that means I'm not supposed to die. I better get back to work."
I visited Millan at the ranch a few months after his suicide attempt. When I arrived he was lying on a bench in the shade, sweating through a purple polo shirt, with a bottle of Maalox resting on his chest. "I'm still managing the depression, the anger, the insecurity," he told me, "but I am moving forward." A pair of hyperactive huskies belonging to his close friend Jada Pinkett Smith ran through the hills pulling a sled Millan had modified for the rocky terrain. Junior, a sleek, gray three-year-old pit bull he was grooming to take Daddy's place, lay quietly under the bench, watching Millan's every move. "I couldn't have done what I do without Daddy," he said, "and now I can't do it without Junior. There's always a pit bull there supporting me."
Millan is a short, stocky guy – "like a burrito," he says – but he carries himself with a straight back, chest jutted out, a natural alpha. When he arrived in the United States 22 years ago, he knew only a single English word - "OK" - and he still talks in a loose, colloquial SoCal Spanglish, rolling through sentences with mixed-up tenses, calling his dog Blizzard a "Jello Lab," pronouncing buffet with a hard t and sushi as "su-chi." On 'Dog Whisperer,' Millan uses the language deficit to his advantage, putting clients at ease with his always polite, effortlessly funny broken-English banter as he (often painfully) dissects their troubled relationships with their dogs. In person he's just as charming – open, inquisitive, with a quick mind and a slightly rough edge that makes him even more likable. For all his alpha-male poise, Millan also possesses humility, which he says comes with the job. "In my field, working with animals, they detest egotistical people," he says. "Dogs are wise. They don't buy BS. . . . When you are egotistical, you're not grounded. So it's not even an option for me to become disconnected or lose my grounding."
All that summer, Millan spent his days at the ranch, clearing brush, digging roads, and planting trees. "Some people turn to cigarettes and alcohol when they have problems," he said. "I use hard work." When the sadness overwhelmed him, he would hike up the nearly vertical rim of the canyon – rocky, dry scrub thick with rattlesnakes – in heat that reached 115 degrees. If he didn't feel better when he got back down, he'd do it again.
One night, "I was sitting under this tree, right here," he said, pulling up in the Gator next to a giant Buddha statue, "and I was crying. I noticed the dogs started coming over, and they surrounded me. There were, like, 11 dogs all around, and they started to lick my face. Normally I don't like to be licked. I'm afraid of germs, but this was different. I had the sense that these dogs were healing me. From that night, I began to get stronger."
by Jason Fine, Men's Journal | Read more:
Photo via FanPop
When a Bough Breaks
A suburban playground on a cold winter’s day. A man in his early 30s, wearing a beanie, leather jacket and scarf, pushes a toddler on a swing, a dead look in his eyes. On the climbing frame, twins are jostling each other. Their mother stands underneath, hopping from foot to foot, her eyes darting from one girl to the next, issuing warnings, instructions; her voice rises anxiously in pitch. Looking around, I see only one adult smiling, but then she’s talking to her friend; their children are some way off, fighting each other with sticks.
There’s nothing particularly striking here. It could be any day of the week, in any town. And there’s nothing revelatory about the thought of parents secretly wishing they were anywhere else but the local playground, perhaps envying their childless friends; even wondering, during the sleepless nights, or in the aftermath of a fight with a recalcitrant teenager, why they had children at all. What is distinctive of our times is how few parents — still, even in our post-Freudian age — will openly admit to feelings of ambivalence towards their children. In an age where very little — from sex to money — is left a mystery, parental ambivalence remains one of the last taboos. (...)
But first, some definitions. In modern usage, ambivalence is often taken to mean having mixed feelings about something or someone. This, though, is a watering down of the concept. As developed by psychoanalysis, ambivalence refers to the fact that, in a single impulse, we can feel love and hate for the same person. It’s a potent, unpalatable idea; and in the grip of intense ambivalence we can feel overwhelmed and confused, as if a vicious civil war is underway inside us: no wonder we’d rather render it toothless. And yet, as any honest parent will tell you, this is often how it feels. Speak of it, though — as Lionel Shriver did in We Need to Talk About Kevin (2003), where Eva, the novel’s narrator, openly admits to deeply ambivalent feelings about her son Kevin — and you will face criticism, even ostracism, from those who would rather not believe that parents can ever harbour such feelings. The problem with Eva, of course, was not that she had ambivalent feelings towards her son, but that she dissembled throughout Kevin’s upbringing, pretending, through all her frantic biscuit-making, that all she felt for her cold, unlikeable son was love.
The question is why she, like so many parents, found it so hard to acknowledge her ambivalence, even to herself. Part of the reason must be that we all know — even if we’re not abreast of current statistics — that we live in a society in which shockingly high levels of violence are inflicted on children. According to the National Society for the Prevention of Cruelty to Children, one in four young adults is ‘severely maltreated’ during childhood, whether in the form of sexual, emotional, physical abuse or neglect. It’s a startlingly high figure, by anyone’s reckoning. And, if we acknowledge that we, too, sometimes have less than loving feelings towards our children; if we, too, sometimes have the wish to hurt, even if we are able to restrain ourselves, then does this mean that we too could be abusers? (...)
The paediatrician and psychoanalyst Donald Winnicott, who spent a lifetime working with children and families, understood why the scales of ambivalence might tip more towards hate than love. The baby, he wrote, ‘is a danger to her body in pregnancy and at birth’, he ‘is an interference with her private life’ and he ‘is ruthless, treats her as scum, an unpaid servant, a slave’. He ‘shows disillusionment about her’, he ‘refuses her good food… but eats well with his aunt’; then, having ‘got what he wants he throws her away like orange peel’. He ‘tries to hurt her’, and, ‘after an awful morning with him she goes out, and he smiles at a stranger, who says: “Isn’t he sweet?”’
And then there is the effect of the arrival of a third party — however planned and wished for — on a couple’s relationship. Nora Ephron, who wrote When Harry Met Sally… (1989), saw it explosively: the birth of a baby, she once said, was like ‘throwing a hand grenade into a marriage’. Lionel Shriver’s mother felt similarly, warning the author, then in her mid-30s and newly in love, that, should she and her partner decide to have a child, motherhood would ‘completely transform’ their relationship. ‘Though she did not spell it out,’ Shriver has written, ‘there was no question that she meant for the worse’. And yet many couples, finding themselves drifting apart, or fighting, opt to have a baby (or another baby) in the belief that this joint creation will restore their lost unity.
Fortunately, societal expectations are changing, albeit slowly. The feminist movement of the 1960s — typified by such books as Betty Friedan’s The Feminine Mystique (1963) — overturned long-held received wisdoms that designated motherhood (in the words of the social researcher Mary Georgina Boulton) as ‘intrinsically rewarding and not problematic’ and refocused attention on women’s actual experience of motherhood. Even so, Friedan set the blame for maternal ambivalence at society’s door, rather than acknowledging that, like paternal ambivalence, the very essence of the maternal role is contradictory, and the feelings roused in parents are equally powerful and often confusing.
Even now, when 21st-century mothers admit to ambivalence, as Rachel Cusk bravely did in her memoir A Life’s Work (2001), they are attacked as irresponsible, even unfit to be parents. And so we continue to enter parenthood blindly, relieved and proud that our genes will survive, and oblivious to the unrelenting demands ahead, or that we have unwittingly signed up for a job for life, with no training, pay, prospect of sabbatical leave, change of career or get-out clause. It’s a job that will require endless investment and patience and, if all doesn’t go too badly, one in which we are finally made redundant. Of course there are rewards, but these come fitfully and often when we least expect them.
by Edward Marriott, Aeon | Read more:
Illustration by Frank Adams
Overwhelmed and Creeped Out
Blendr is the most high-profile of a series of new location-based dating apps for straight people. It was created by the same folks who made Grindr, the hookup app that’s become ubiquitous in the gay community. In June, Grindr announced it now has four and a half million users (six hundred thousand of them in the U.S.), and that they spend an average of ninety minutes browsing every single day. Contrast Grindr’s success with that of Blendr: the founders weren’t willing to disclose the number of users, opting instead to send me an anodyne statement that they “are thrilled with the pace of Blendr’s growth,” which, they say, “was faster in the first six months of launch than Grindr’s adoption rate during its first six months.” The company declined to say how many of those users are actually, well, using the app. If my own reaction is any indication, it’s no wonder. After my initial session, I only opened the app to show it to friends, scrolling through pages and pages of unappealing men in what resembled a masochistic digital-age performance-art piece titled “Why I’m Single.”
In truth, though, I tried Blendr not to find love, but at the behest of a bevy of Web developers. Around the time that Blendr launched in September, 2011, I wrote a short article declaring that the app was destined to fail. I argued that it didn’t take seriously the concerns of women—safety, proximity, control—even though the founder Joel Simkhai told GQ, “As a gay man, I probably understand straight women more than straight guys do.” Yeah, but probably not enough. Since airing my skepticism, I’ve received an e-mail or Facebook message every couple of months from a male entrepreneur who wants to pick my brain about how to make a location-based dating app appeal to women. “Blendr is generally useless, and there is a huge, untapped market for a hookup app for straights (or everyone other than gay men, really),” one of them wrote to me. “Attitudes towards sex have shifted massively in the past decade or so, not just amongst young people.”
And not just among men. But you wouldn’t know it by looking at the founders of every major dating start-up. From the Web-based heavy hitters like OkCupid, eHarmony, and Plenty of Fish on down to newer apps like Skout, How About We, and MeetMoi, they’re all developed by men. This might not seem like a big deal, until you consider one read on why Grindr has been so successful: the app has a “for us by us” appeal to gay men. But when it comes to heterosexual-dating technology, all-male co-founders represent the wants and needs of only half of their target audience. Sure, they can try to focus-group their way out of the problem, but if an app for “straight” people is to get anywhere close to Grindr’s level of success, women have to not just join out of curiosity. They have to actually use it.
by Ann Friedman, New Yorker | Read more:
Illustration by Istvan Banyai
Reposts
[ed. Sometimes, the game trails just peter out and end up in the bushes, with nothing much new to report. When that happens, I think I'll just randomly select some previous posts and introduce new visitors to the Duck Soup archives. Feel free to explore on your own.]
How to Have Rational Discussion
The Elements of Style
Always Strive to Be Polite
Spring is Just Around the Corner
Martha My Dear
Actors Acting Out
Dam Amazing
For Sorrow There is No Remedy
Wednesday, February 27, 2013
Welcome to the Coldscape
More than three-quarters of the food consumed in the United States today is processed, packaged, shipped, stored, and sold under artificial refrigeration. The shiny, humming stainless steel box in your kitchen is just the tip of the iceberg, so to speak—a tiny fragment of the vast global network of temperature-controlled storage and distribution warehouses cumulatively capable of hosting uncounted billions of cubic feet of chilled flesh, fish, or fruit. Add to that an equally vast and immeasurable volume of thermally controlled space in the form of shipping containers, wine cellars, floating fish factories, international seed banks, meat-aging lockers, and livestock semen storage, and it becomes clear that the evolving architecture of coldspace is as ubiquitous as it is varied, as essential as it is overlooked.
J. M. Gorrie, a Florida doctor, was awarded the first US patent for mechanical refrigeration in 1851, with a device intended to cool cities rather than popsicles. Held back by heavy opposition led by the powerful natural-ice trade, not to mention the technical challenges that made early coldspaces risky as well as expensive propositions, artificial refrigeration for food only snowballed in the first half of the twentieth century, alongside the invention of plastic wrap and the introduction of self-serve supermarkets. Its story is central to every aspect of our national postwar narrative: the widespread entry of women into the workforce, the rise of suburban living, and the reshaping of the American landscape by the automobile. Gradually, at first, but now completely, in the United States—the first refrigerated nation—and then beyond, a network of artificially chilled warehouses, cabinets, and reefer fleets have elided place and time, reshaping both markets and cities with the promise of a more rational food supply and an end to decay, waste, and disease.
Despite the efforts of industry bodies, government agencies, and industrial archaeologists, this vast, distributed artificial winter that has reshaped our entire food system remains, for the most part, unmapped. What’s more, the varied forms of these cold spaces remain a mystery to most. This guide provides an introduction to a handful of the strange spatial typologies found within the “cold chain,” that linked network of atmospheric regulation on which our entire way of life depends.
These are spaces in which a perpetual winter has distorted or erased seasonality; spaces that are located within an energy-intensive geography of previously unimaginable distance—both mental and physical—between producers and consumers. Artificial refrigeration has reconfigured the contents of our plates and the shape of our cities—it has even contributed to the overthrow of governments, as anyone familiar with the rise and fall of United Fruit can attest. Perhaps most bizarrely, although their variations in form reflect the particular requirements of the perishable product they host, coldspaces have, in turn, redesigned food itself, both in terms of the selective breeding that favors cold-tolerance over taste and the more fundamental transition from food as daily nourishment to food as global commodity.
Welcome to the coldscape: the unobtrusive architecture of man’s unending struggle against time, distance, and entropy itself.
J. M. Gorrie, a Florida doctor, was awarded the first US patent for mechanical refrigeration in 1851, with a device intended to cool cities rather than popsicles. Held back by heavy opposition led by the powerful natural-ice trade, not to mention the technical challenges that made early coldspaces risky as well as expensive propositions, artificial refrigeration for food only snowballed in the first half of the twentieth century, alongside the invention of plastic wrap and the introduction of self-serve supermarkets. Its story is central to every aspect of our national postwar narrative: the widespread entry of women into the workforce, the rise of suburban living, and the reshaping of the American landscape by the automobile. Gradually, at first, but now completely, in the United States—the first refrigerated nation—and then beyond, a network of artificially chilled warehouses, cabinets, and reefer fleets have elided place and time, reshaping both markets and cities with the promise of a more rational food supply and an end to decay, waste, and disease.
Despite the efforts of industry bodies, government agencies, and industrial archaeologists, this vast, distributed artificial winter that has reshaped our entire food system remains, for the most part, unmapped. What’s more, the varied forms of these cold spaces remain a mystery to most. This guide provides an introduction to a handful of the strange spatial typologies found within the “cold chain,” that linked network of atmospheric regulation on which our entire way of life depends.
by Nicola Twilley, Cabinet | Read more:
Photo: Nicola Twilley
Pathologising the Norm
One in four of us will struggle with a mental illness this year, the most common being depression and anxiety. The upcoming publication of the fifth edition of the Diagnostic and Statistical Manual for Mental Disorders (DSM) will expand the list of psychiatric classifications, further increasing the number of people who meet criteria for disorder. But will this increase in diagnoses really mean more people are getting the help they need? And to what extent are we pathologising normal human behaviours, reactions and mood swings?
The revamping of the DSM – an essential tool for mental health practitioners and researchers alike, often referred to as the ‘psychiatry bible’ – is long overdue; the previous version was published in 1994. This revision provides an excellent opportunity to scrutinise what qualifies as psychiatric illness and the criteria used to make these diagnoses. But will the experts make the right calls?
The complete list of new diagnoses was released recently and included controversial disorders such as ‘excessive bereavement after a loss’ and ‘internet use gaming disorder’. The inclusion of these syndromes raises the important question of what actually qualifies as pathology. Are we really helping more people by expanding these diagnostic criteria, discovering problems that were always there but previously unaddressed, or are we just creating new problems that now need to be treated? Moreover, the crucial questions of what these treatments entail and who will really benefit from them need to be asked, not only for these new diagnoses but for our mental health care system as a whole.
There has been an explosion in psychiatric diagnoses over the last 30 years, due in large part to a change in ethos in the treatment and research of mental illness. This began in 1980 with the publication of the DSM-III, the first major revision of the manual to consider psychiatric disorders as physical diseases with biological origins, rather than mental illnesses stemming from intra- and interpersonal roots. This shift coincided with the development of the first effective psychiatric drugs (e.g. Prozac), thus enabling psychiatrists to prescribe medicine in treatment rather than relying on cognitive or psychoanalytic talk therapies. Since then, psychiatric diagnoses have more than doubled in the last 25 years, with this trend especially prominent in childhood disorders.
Attention deficit hyperactivity disorder (ADHD) is particularly exemplary of this phenomenon, with diagnoses skyrocketing over the last 10 years, up 66 percent in the United States since 2000. This may be partially due to a recent change in the diagnostic guidelines from the American Academy of Pediatrics, suggesting that children as young as 4 and as old as 18 be screened and treated for the condition (ADHD was previously only diagnosed in children aged 6-12). Widening this age range may enable parents to start seeking help for a troubled child earlier on, or include adolescents and adults previously thought to be too old to have ADHD, thus contributing to the increase in numbers. However, it is unlikely that this age change alone explains the ADHD boom.
Accounts by parents and clinicians alike suggest that the more common diagnoses of ADHD become, the easier they are to obtain. The spread of the disorder seems to have taken on epidemic proportions, stretching across geographical and socio-cultural boundaries. As such, acquiring a classification of ADHD has become remarkably easy. Diagnoses are made based on a clinician’s observations and subjective self-reports from the child, alongside comments and complaints from teachers and parents. There is no chemical or objective diagnostic test to identify ADHD, just as there are no such tests for the vast majority of psychiatric disorders. Clinicians must instead base their decisions on the symptoms described by the patient and his or her parents, matching their complaints to the criteria listed in the DSM. While this practice can be seen as progressive, giving those in need easier access to the treatments they require, it can also result in the undesirable consequence of widespread over-diagnosis in those who would not have originally qualified for the disorder.
This increase has also resulted in a dramatic rise in the number of prescriptions for psychostimulant medications used to treat ADHD, up 375% as of 2003. Pathologising and subsequently prescribing medication to help control a ‘problem child’ is a worrying side effect of the broadening of diagnostic criteria. A concerning trend has emerged for parents to give their children psychostimulant medication to treat inattention or hyperactivity in school, without an official diagnosis of ADHD. Clinicians willing to go along with this practice believe that these sub-threshold children can benefit from the calming and focusing effects of the drugs, and that they will help improve their academic performance. However, this seems a highly dubious practice, as the referring clinician may have a financial investment in writing these prescriptions, receiving perks or consulting fees from the very drug companies whose medications they are prescribing. This conflict of interest can significantly contribute to the free-flowing prescriptions for psychoactive medications, particularly in cases where the full-fledged diagnosis of ADHD is not warranted. (...)
An important question that needs to be raised regarding these recent increases in psychiatric diagnoses is what role the multi-billion pound pharmaceutical industry plays in this trend. With the rise in diagnoses comes a spike in prescriptions, poising pharmaceutical companies to make millions off the expansion of these varying diagnostic criteria.
This is particularly applicable in the recent changes made to the qualifications for clinical depression, with the DSM deciding to drop the exclusion of bereavement in its classification of the disorder. This means that individuals going through the natural grieving process following the loss of a loved one can now be prescribed anti-depressant drugs to help them cope. In the previous version of the DSM, only cases of ‘excessive bereavement’ (severe depressive symptoms lasting longer than eight weeks) were covered under the criteria for depression, enabling those who needed help to receive it, but without truncating or pathologising the normal grieving process. However, doing away with the time restriction and enabling those experiencing sadness immediately following the death of a loved one to receive pharmaceutical treatment unnecessarily medicalises this process. Additionally, the antidepressants prescribed in these situations (usually serotonin reuptake inhibitors, or SSRIs) are not entirely innocuous, and can be accompanied by unpleasant side effects. Furthermore, SSRIs can take up to four weeks to have full effect, and prescriptions usually last for several months. Thus, the immediate benefit to the patient when they need it the most would be limited, and it is likely they would be on the medication for longer than necessary.
The Washington Post recently investigated the decision behind this change and discovered that 8 of the 11 members on the board of the APA, the American Psychiatric Association, who were responsible for the revisions to the DSM have various financial ties to several different pharmaceutical companies. These include owning stock options, receiving consultation fees, and obtaining grant funding from the industry. These conflicts of interest can create serious potential bias in those members to best serve the financial interests of these companies; and in the current dilemma regarding the medicalisation of bereavement, the potential increase in profit from the rise in prescriptions is tremendous, almost undoubtedly influencing the decisions of those on the board. Furthermore, one of the chief advisors to the committee was the lead author of a study promoting Wellbutrin, an antidepressant drug developed by GlaxoWellcome, as an effective treatment for the alleviation of depressive symptoms following the loss of a loved one. The consultation from this individual, who could benefit both personally and professionally from such a change, was clearly biased in this situation, and most likely ended up swaying the board’s decision to its present outcome.
Cinema Tarantino: The Making of Pulp Fiction
“Every major studio passed,” says Lawrence Bender. Then, says DeVito, “I gave it to the king, Harvey Weinstein.”
It went through Richard Gladstein, who was now at Miramax. Weinstein, who had recently merged Miramax with Disney in an $80 million deal, was walking out of his L.A. office on his way to catch a plane for a vacation on Martha’s Vineyard when Gladstein handed him the script. “What is this, the fucking telephone book?,” Weinstein asked him when he saw that it was 159 pages, the norm being 115. He lugged the script to the plane, however.
“He called me two hours later and said, ‘The first scene is fucking brilliant. Does it stay this good?’ ” remembers Gladstein. He called again an hour later, having read to the point where the main character, the hit man Vincent Vega, is shot and killed. “Are you guys crazy?” he yelled. “You just killed off the main character in the middle of the movie!”
“Just keep reading,” said Gladstein. “And Harvey says, ‘Start negotiating!’ So I did, and he called back shortly thereafter and said, ‘Are you closed yet?’ I said, ‘I’m into it.’ Harvey said, ‘Hurry up! We’re making this movie.’ ”
Disney may have seemed an unlikely match for Pulp Fiction, but Weinstein had the final say. “As for [then chairman] Jeffrey Katzenberg, that was the first test of what I call autonomy with Jeffrey,” says Weinstein. “When I signed my contract with Disney selling Miramax, with us still running the company, I wrote the word ‘autonomy’ on every page, because I had heard that Jeffrey was notorious for not giving it. When I read the Pulp Fiction script, I went to him and said, ‘Even though I have the right to make this, I want to clear it with you.’ He read it and said, ‘Easy on the heroin scene, if you can, but that is one of the best scripts I have ever read. Even though you don’t need it, I am giving you my blessing.’ ”
The script was sent out to actors with the warning “If you show this to anybody, two guys from Jersey [Films] will come and break your legs.”
by Mark Seal, Vanity Fair | Read more:
Photograph by Annie Leibovitz
Reinventing Society in the Wake of Big Data
Recently I seem to have become MIT's Big Data guy, with people like Tim O'Reilly and "Forbes" calling me one of the seven most powerful data scientists in the world. I'm not sure what all of that means, but I have a distinctive view about Big Data, so maybe it is something that people want to hear.
I believe that the power of Big Data is that it is information about people's behavior instead of information about their beliefs. It's about the behavior of customers, employees, and prospects for your new business. It's not about the things you post on Facebook, and it's not about your searches on Google, which is what most people think about, and it's not data from internal company processes and RFIDs. This sort of Big Data comes from things like location data off of your cell phone or credit card, it's the little data breadcrumbs that you leave behind you as you move around in the world.
What those breadcrumbs tell is the story of your life. It tells what you've chosen to do. That's very different than what you put on Facebook. What you put on Facebook is what you would like to tell people, edited according to the standards of the day. Who you actually are is determined by where you spend time, and which things you buy. Big data is increasingly about real behavior, and by analyzing this sort of data, scientists can tell an enormous amount about you. They can tell whether you are the sort of person who will pay back loans. They can tell you if you're likely to get diabetes.
They can do this because the sort of person you are is largely determined by your social context, so if I can see some of your behaviors, I can infer the rest, just by comparing you to the people in your crowd. You can tell all sorts of things about a person, even though it's not explicitly in the data, because people are so enmeshed in the surrounding social fabric that it determines the sorts of things that they think are normal, and what behaviors they will learn from each other.
As a consequence, analysis of Big Data is increasingly about finding connections, connections with the people around you, and connections between people's behavior and outcomes. You can see this in all sorts of places. For instance, one type of Big Data and connection analysis concerns financial data. Not just the flash crash or the Great Recession, but also all the other sorts of bubbles that occur. These are systems of people, communications, and decisions that go badly awry. Big Data shows us the connections that cause these events. Big data gives us the possibility of understanding how these systems of people and machines work, and whether they're stable.
The notion that it is connections between people that is really important is key, because researchers have mostly been trying to understand things like financial bubbles using what is called Complexity Science or Web Science. But these older ways of thinking about Big Data leave the humans out of the equation. What actually matters is how the people are connected together by the machines and how, as a whole, they create a financial market, a government, a company, and other social structures.
Because it is so important to understand these connections, Asu Ozdaglar and I have recently created the MIT Center for Connection Science and Engineering, which spans all of the different MIT departments and schools. It's one of the very first MIT-wide Centers, because people from all sorts of specialties are coming to understand that it is the connections between people that are actually the core problem in making transportation systems work well, in making energy grids work efficiently, and in making financial systems stable. Markets are not just about rules or algorithms; they're about people and algorithms together.
Understanding these human-machine systems is what's going to make our future social systems stable and safe. We are getting beyond complexity, data science and web science, because we are including people as a key part of these systems. That's the promise of Big Data, to really understand the systems that make our technological society. As you begin to understand them, then you can build systems that are better. The promise is for financial systems that don't melt down, governments that don't get mired in inaction, health systems that actually work, and so on, and so forth.
The barriers to better societal systems are not about the size or speed of data. They're not about most of the things that people are focusing on when they talk about Big Data. Instead, the challenge is to figure out how to analyze the connections in this deluge of data and come to a new way of building systems based on understanding these connections.
by Sandy Pentland, Edge | Read more:
Photo: uncredited
Fingermouse
Wearable accelerometers aren’t just for fitness trackers anymore. Newly founded Innovative Developments is releasing Mycestro, a wearable “fingermouse,” via Kickstarter.
It’s more than just an alternative to the optical mouse, though. Mycestro is a user interface tool that enables gesture control without the arm-fatigue issues of Minority Report-style motion tracking. It changes how you interact with your desktop and, by offering new ways to control them, could even change how those desktops are designed in the first place.
Built to slip on an index finger and track the wearer’s movements, the Mycestro allows the wearer to move the cursor without reaching for a mouse, and joins a growing cohort of wearable devices, says creator Nick Mastandrea.
“It’s a well-rounded device,” he says. “It’s actually a little bit on the simplistic side. But the application, how we’re using it and how you can interface to it, is all transitioning towards the new, evolved, high-tech person.”
A user wearing Mycestro touches her thumb to it to engage the cursor, taps her finger to click, and slides her thumb along the device to scroll. Mycestro uses a gyroscope to track positioning in 3-D space, translating that to the 2-D screen via the integrated app, and registers other functions, like tap-to-click, via a built-in accelerometer. Data from the accelerometer also helps correct the gyroscope, improving precision control, and the whole system is coordinated by Bluetooth low energy.
It’s so precise, in fact, that with the sensitivity turned up, it can register over-caffeinated coffee jitters. But Mastandrea adds that software updates will be able to track involuntary user movements over time and compensate to filter them out, a technology which could eventually be modified to help users with neurological diseases like Parkinson’s operate a computer more easily.
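The gyroscope-plus-accelerometer arrangement the article describes — a gyro for fast, precise orientation tracking, an accelerometer to correct the gyro's slow drift — is a classic sensor-fusion pattern. Below is a minimal, hypothetical sketch of one common approach (a complementary filter), purely to illustrate the general technique; it is not Mycestro's actual firmware, and all names and constants are invented for illustration:

```python
# Hypothetical sketch of gyro + accelerometer fusion via a complementary
# filter, illustrating the general technique described in the article.
# Not Mycestro's actual firmware; all constants are illustrative.
import math

class ComplementaryFilter:
    """Fuse gyroscope rate with accelerometer tilt to estimate pitch."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha  # weight on the fast (but drifting) gyro path
        self.pitch = 0.0    # estimated pitch angle, in radians

    def update(self, gyro_rate, accel_y, accel_z, dt):
        # Integrating the gyro's angular rate is precise short-term
        # but accumulates drift over time.
        gyro_pitch = self.pitch + gyro_rate * dt
        # Gravity sensed by the accelerometer gives an absolute, if
        # noisy, tilt reference that never drifts.
        accel_pitch = math.atan2(accel_y, accel_z)
        # Blend: trust the gyro short-term, the accelerometer long-term.
        self.pitch = self.alpha * gyro_pitch + (1 - self.alpha) * accel_pitch
        return self.pitch

def pitch_to_cursor_dy(pitch_delta, sensitivity=800.0):
    """Map a change in pitch (radians) to vertical cursor motion (pixels)."""
    return int(pitch_delta * sensitivity)
```

Lowering the blend weight (`alpha`) would also damp high-frequency jitter — the kind of tremor filtering the article suggests could one day help users with Parkinson's — at the cost of slower cursor response.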
It’s more than just an alternative to the optical mouse, though. Mycestro is a user interface tool that enables gesture control without the arm-fatigue issues of Minority Report-style motion tracking. It changes how you interact with your desktop and, by offering new ways to control them, could even change how those desktops are designed in the first place.
Built to slip on an index finger and track the wearer’s movements, the Mycestro allows the wearer to move the cursor without reaching for a mouse, and joins a growing cohort of wearable devices, says creator Nick Mastandrea.
“It’s a well-rounded device,” he says. “It’s actually a little bit on the simplistic side. But the application, how we’re using it and how you can interface to it, is all transitioning towards the new, evolved, high-tech person.”
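[ed. The article doesn't say exactly how Mycestro fuses its sensors, but the scheme it describes — a gyroscope for fast 3-D tracking, with accelerometer data correcting the gyro's drift — is classically handled with a complementary filter. A minimal sketch; the numbers and the filter itself are illustrative assumptions, not Mycestro's actual firmware:]

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend a fast-but-drifting gyro integral with a noisy-but-stable
    accelerometer tilt estimate. The small (1 - alpha) accelerometer term
    continuously pulls the gyro estimate back, canceling drift."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Simulate a finger held at a constant 30-degree tilt, read by a gyro
# with a +0.5 deg/s bias (pure integration would drift without bound)
# and an idealized, noise-free accelerometer.
angle = 0.0
true_rate, bias, dt = 0.0, 0.5, 0.01
for _ in range(2000):             # 20 seconds of 100 Hz samples
    gyro = true_rate + bias       # biased gyro reading, deg/s
    accel = 30.0                  # accelerometer-derived tilt, deg
    angle = complementary_filter(angle, gyro, accel, dt)

# Pure gyro integration would be 10 degrees off by now; the fused
# estimate settles within a fraction of a degree of the true 30.
print(round(angle, 1))
```

A Kalman filter is the heavier-duty way to do this; the complementary filter captures the same accelerometer-corrects-gyro idea in one line per axis.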
by Nathan Hurst, Wired | Read more:
Photo: Innovative Development
Tuesday, February 26, 2013
Sorry Not Sorry
Thus was Lance Armstrong pilloried in our virtual town square. The disgraced cyclist sat with Oprah, the infallible high priestess of our national church, to be rehabilitated by her divine grace. But Armstrong didn’t play by the rules; he apologized without submitting. This is just not done: we want icons to fall into abject ruin and then plead for redemption – to throw themselves on the mercy of the court so we may assess their bathetic supplication. The initial transgression that lets us show our generous clemency must, for this reason, be forgivable. But we cannot forgive the real transgression: the false confession or suspect apology.
A notorious recent non-apology illustrates this pattern. Summoned to the couch after his 2011 hookers-’n’-blow blow-up, Charlie Sheen also goes off-script. Refusing to play his part, he fully inhabits his excess, “owning” his narcissistic indulgence, even upping the ante. He baffles his inquisitors, intensifying their lust to repress him. One exemplary interview shows Sheen riding the tense line between assimilation and expulsion:
Sheen’s chirpy but steely-eyed interviewer leads off with: “Your anger and your hate is coming off as erratic.” But Sheen quickly corrects her: “My passion, my passion.” Wait, what? You’re not confessing? She bears down: “When’s the last time you used?” she demands. Sheen scoffs, “I use a blender, I use a vacuum cleaner.” “You’re clean right now, and so is this better now, your life now, clean, with your children?” At least denounce drugs – tell the nation you prefer your children to drugs! Admit you are better sober, concede you were insane before: confess! “It’s not about better,” Sheen calmly explains, “it’s just different. It doesn’t compare, they’re different realities.” When he explicitly endorses his experiences with drugs and gives his reasons, she ignores him and vertiginously asks, “When you look back on the last time you used drugs, are you disgusted with yourself?” This is the non sequitur of the fundamentalist deafened by her cause. Sheen replies: “I’m proud of what I created, it was radical.”
So the script got cracked, shredded by an ad-libbing lunatic, the nation’s confessional desecrated. This is not to endorse Sheen; the domestic abuse is odious, the one-man show abysmal, the sitcom an idiotic self-parody. Rather, this interview is what happens when the role of the propitiator is miscast. Sheen’s recalcitrance exposes the spectacle of social repression, his psychosis explodes the charade, turns it inside out: he can be neither normalized nor dismissed. Sheen stalks a tenebrous boundary: he must feel the repressive-generative public desire bearing down on him, forcing on him the stark choice: insanity or clemency.
Hardly just some occasional, amusing pop-cultural publicity stunt benefiting captor and captive alike, this failed confession recapitulates the logic of the police station. One typical how-to textbook on criminal interrogation, echoing Sheen’s cross-examination, “describe[s] in vivid detail a nine-step procedure designed to overcome the resistance of reluctant suspects:
Using this procedure, the interrogator begins by confronting the suspect with his or her guilt (Step 1); develops psychological “themes” that justify or excuse the crime (2); interrupts all statements of denial (3); overcomes the suspect’s factual, moral, and emotional objections to the charges (4); ensures that the increasingly passive suspect does not tune out (5); shows sympathy and understanding and urges the suspect to tell the truth (6); offers the suspect a face-saving alternative explanation for his or her guilty action (7); gets the suspect to recount the details of the crime (8); and converts that statement into a full written confession (9).”
The coerciveness of the process is transparent when someone rebels, as Sheen did. But it is more insidious and layered when someone like Lance Armstrong complies. His ambivalent, semi-deferent apology exposes the paradox of the public confession. The accusation, inquisition, confession, and pardon must be grave enough to raise and satisfy our demand for retribution, but vacuous enough to remain a ritualistic ethical performance. The sin must be offensive enough to call for real punishment – and not some measly admission – while the confession must be vapid enough to prevent critical resistance. Hence the symbolic process is perverse: it must be heavy enough to coerce but light enough to entertain. The absolution process has the discipline and tact to shield the brute venality of the interrogation – that is, to focus only on the sinner’s discrete violations. Oprah plays her part, but Lance cannot rise to the occasion, and hence risks exposing the entire charade.
by Elliott Prasse-Freeman and Sayres Rudy, The New Inquiry | Read more:
Image: uncredited
It’s For Your Own Good!
Many Americans abhor paternalism. They think that people should be able to go their own way, even if they end up in a ditch. When they run risks, even foolish ones, it isn’t anybody’s business that they do. In this respect, a significant strand in American culture appears to endorse the central argument of John Stuart Mill’s On Liberty. In his great essay, Mill insisted that as a general rule, government cannot legitimately coerce people if its only goal is to protect people from themselves. Mill contended that
the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. His own good, either physical or mental, is not a sufficient warrant. He cannot rightfully be compelled to do or forbear because it will be better for him to do so, because it will make him happier, because, in the opinion of others, to do so would be wise, or even right.
A lot of Americans agree. In recent decades, intense controversies have erupted over apparently sensible (and lifesaving) laws requiring people to buckle their seatbelts. When states require motorcyclists to wear helmets, numerous people object. The United States is facing a series of serious disputes about the boundaries of paternalism. The most obvious example is the “individual mandate” in the Affordable Care Act, upheld by the Supreme Court by a 5–4 vote, but still opposed by many critics, who seek to portray it as a form of unacceptable paternalism. There are related controversies over anti-smoking initiatives and the “food police,” allegedly responsible for recent efforts to reduce the risks associated with obesity and unhealthy eating, including nutrition guidelines for school lunches.
Mill offered a number of independent justifications for his famous harm principle, but one of his most important claims is that individuals are in the best position to know what is good for them. In Mill’s view, the problem with outsiders, including government officials, is that they lack the necessary information. Mill insists that the individual “is the person most interested in his own well-being,” and the “ordinary man or woman has means of knowledge immeasurably surpassing those that can be possessed by any one else.”
When society seeks to overrule the individual’s judgment, Mill wrote, it does so on the basis of “general presumptions,” and these “may be altogether wrong, and even if right, are as likely as not to be misapplied to individual cases.” If the goal is to ensure that people’s lives go well, Mill contends that the best solution is for public officials to allow people to find their own path. Here, then, is an enduring argument, instrumental in character, on behalf of free markets and free choice in countless situations, including those in which human beings choose to run risks that may not turn out so well. (...)
Emphasizing these and related behavioral findings, many people have been arguing for a new form of paternalism, one that preserves freedom of choice, but that also steers citizens in directions that will make their lives go better by their own lights. (Full disclosure: the behavioral economist Richard Thaler and I have argued on behalf of what we call libertarian paternalism, known less formally as “nudges.”) For example, cell phones, computers, privacy agreements, mortgages, and rental car contracts come with default rules that specify what happens if people do nothing at all to protect themselves. Default rules are a classic nudge, and they matter because doing nothing is exactly what people will often do. Many employees have not signed up for 401(k) plans, even when it seems clearly in their interest to do so. A promising response, successfully increasing participation and strongly promoted by President Obama, is to establish a default rule in favor of enrollment, so that employees will benefit from retirement plans unless they opt out. In many situations, default rates have large effects on outcomes, indeed larger than significant economic incentives.
Default rules are merely one kind of “choice architecture,” a phrase that may refer to the design of grocery stores, for example, so that the fresh vegetables are prominent; the order in which items are listed on a restaurant menu; visible official warnings; public education campaigns; the layout of websites; and a range of other influences on people’s choices. Such examples suggest that mildly paternalistic approaches can use choice architecture in order to improve outcomes for large numbers of people without forcing anyone to do anything.
In the United States, behavioral findings have played an unmistakable part in recent regulations involving retirement savings, fuel economy, energy efficiency, environmental protection, health care, and obesity. In the United Kingdom, Prime Minister David Cameron has created a Behavioural Insights Team, sometimes known as the Nudge Unit, with the specific goal of incorporating an understanding of human behavior into policy initiatives. In short, behavioral economics is having a large impact all over the world, and the emphasis on human error is raising legitimate questions about the uses and limits of paternalism.
by Cass R. Sunstein, NY Review of Books | Read more:
Image: ConsumerFreedom.com
Monday, February 25, 2013
Joe Zawinul and The Zawinul Syndicate
[ed. Yea! Got my router working again and it's music night (in celebration of our second anniversary). Enjoy.]
Your Guide to Life Under the Copyright Alert System
The Copyright Alert System (CAS) is set to go live soon [ed. next week]. If you're an executive at a major content company like Fox or Universal, you might consider this good news.
Image by Jason Reed
For everyone else, well, it's complicated. The CAS has been in the works for years now—you may have heard it mentioned under other names, like the “Six Strikes” system—so it can be kind of a mess to make sense of it now that it's about to take effect.
We've put together a quick guide to give you a sense of the new state of the Internet in the U.S.
What is the CAS?
It's an agreement between the five largest Internet service providers (ISPs) in the country and major content companies, such as music studios and record labels. It aims to cut down on illegal filesharing.
Does it directly affect me?
If you use AT&T, Cablevision, Comcast, Time Warner, or Verizon to get online at home, then yes. It's either now part of your terms of service or it's about to be.
How does it work?
It's an automated "graduated response" system, meaning it slowly ramps up your punishments each time it thinks you're pirating files. The first two times, you just receive an email and a voicemail saying you've been caught. The third and fourth times, you're redirected to some "educational" material, and you'll have to click that you understood it. The fifth and sixth times, it gets serious: Your Internet connection can be slowed to a crawl for a few days.
Then what happens?
Then, well, you've "graduated" from the system. No more alerts. Congrats! The CAS won't hamper you any more. Except the content companies might now try to sue you as a serial pirate. And the fact that you've been cited six times already for copyright infringement will likely be used in court against you.
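[ed. The graduated response laid out in the Q&A above is really just a small escalation table. A sketch of that ladder; the tiers are paraphrased from this article, not from any ISP's actual implementation:]

```python
def cas_response(alert_count):
    """Map a subscriber's cumulative alert count to the CAS's
    graduated response, per the six-strike ladder described above."""
    if alert_count <= 0:
        return "no action"
    if alert_count <= 2:
        return "email and voicemail notice"
    if alert_count <= 4:
        return "redirect to educational material (acknowledgement required)"
    if alert_count <= 6:
        return "temporary connection throttling"
    return "no further alerts (possible litigation exposure)"

for n in range(1, 8):
    print(n, cas_response(n))
```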
by Kevin Collier, Daily Dot | Read more:
In Pursuit of Taste, en Masse
Americans didn’t always ask so many questions or expect so much in their quest for enjoyment. It was enough for them simply to savor a good cigar, a nice bottle of wine or a tasty morsel of cheese.
Not anymore. Driven by a relentless quest for “the best,” we increasingly see every item we place in our grocery basket or Internet shopping cart as a reflection of our discrimination and taste. We are not consumers. We have a higher calling. We are connoisseurs.
Connoisseurship has never been more popular. Long confined to the serious appreciation of high art and classical music, it is now applied to an endless cascade of pursuits. Leading publications, including The New York Times, routinely discuss the connoisseurship of coffee, cupcakes and craft beers; of cars, watches, fountain pens, lunchboxes, stereo systems and computers; of tacos, pizza, pickles, chocolate, mayonnaise, cutlery and light (yes, light, which is not to be confused with the specialized connoisseurship of lighting). And the Grateful Dead, of course.
This democratization of connoisseurship is somewhat surprising since as recently as the social upheavals of the 1960s and ’70s connoisseurship was a “dirty word” — considered “elitist, artificial, subjective and mostly imaginary,” said Laurence B. Kanter, chief curator of the Yale University Art Gallery. Today, it is a vital expression of how many of us want to see, and distinguish, ourselves.
As its wide embrace opens a window onto the culture and psychology of contemporary America, it raises an intriguing question: If almost anything can be an object of connoisseurship — and if, by implication, almost anyone can be a connoisseur — does the concept still suggest the fine and rare qualities that make it so appealing?
Two Years
[ed. It's our second anniversary! Thanks to everyone who's ever stopped by, I hope you've found something interesting to take with you. We'll try to make year three even better.]
Sunday, February 24, 2013
Iditarod Trails Athlete Beats Path Toward Nome
Only Hewitt doesn't winter out. Hewitt summers out and comes here in winter to hike the Iditarod Trail, the whole 1,000 miles of it from the old Cook Inlet port of Knik north through the frozen heart of the 49th state to the still-thriving gold-mining community of Nome on the Bering Sea. He's already done this six times, more than any person alive. He's back for the seventh try this year.
The desolate, lonely, little-traveled Iditarod in winter offers what Hewitt considers the ultimate "vacation." Forget the howling winds of the Alaska Range or Bering Sea coast that can knock a man off his feet. Ignore the brutal, 50-degree-below-zero cold of the Interior that killed the protagonist in author Jack London's classic short story "To Build a Fire." Just keep moving and you'll be fine. That's Hewitt's mantra.
The man doesn't belong in this century. He would fit better in the Alaska of 1913 than that of 2013. He doesn't seem to understand that the serious Iditarod competitions of the modern day are dominated by the gas-powered, fire-breathing snowmobiles of the Iron Dog that can hit 100 mph, and Iditarod Trail Sled Dog Race canines that are now more hound than husky, pulling carbon-fiber dogsleds driven by professional mushers with celebrity-size egos.
Hewitt seems to have no ego, though he should. Last year he accomplished a feat unimaginable in the world of human-powered endurance competitions. He led the Iditarod Trail Invitational race for about 200 miles to the crest of the Alaska Range. The Invitational is an event open to anyone and any machine powered by human muscles. Like Hewitt, it is a throwback to days when people competed in sport for the sheer joy of competition, not for the money, nor even for the glory.
The Invitational offers no cash prize, and there is little fame attached to success outside the small world of extreme endurance athletes. Who knows, for instance, that Hewitt set a foot-racing record for the Iditarod on the fifth of his six trips up the trail?
He made it to Nome in 20 days, 7 hours and 17 minutes in 2011. That is a little more than seven hours faster than the time it took the dog team of the late Carl Huntington, the only musher in history to win both the Iditarod and Fur Rendezvous World Championship sprint race, to reach the finish line during his victorious Iditarod of 1974.
by Craig Medred, Alaska Dispatch | Read more:
Photo: Craig Medred
I'm a Shut-in. This is My Story.
[ed. Once in a while something really amazing comes completely out of left field. Like this. There's an extraordinarily talented writer at work here.]
For five years I have been a recluse. I don't leave the house for months at a time1. I venture out into the world only when it is necessary to maintain my isolation. I'm not agoraphobic, I'm not depressed, and I'm not insane 2. I simply don't socialize.
There are a lot of names for people like me. We are called shut-ins, hermits, recluses and so on. These words mean different things depending on what media you have been exposed to. To some, a hermit is a monastic human living high in the Himalayas connecting with his inner self through meditation and isolation. Some picture a crazy, bearded, old fellow, cooking up whiskey deep in the Appalachian wilderness. Some picture a Howard Hughes type: they imagine a man who harvests his fingernails and wears tinfoil hats to keep the aliens out.
Preconceptions are a difficult thing to overcome. The meanings we assume of words are our biggest obstacle to communication. Instead of fighting an uphill battle against meanings, let us leave the words we know behind and introduce a new one.
Hikikomori is a Japanese word which means "pulling inward". It has been used as a label to describe an emerging phenomenon in Japan, that of adolescents withdrawing from the world. We aren't going to stick to any hard definitions of hikikomori. Instead, we are going to use it only as a convenient placeholder to refer to a spectrum of individuals similar to, but not necessarily, like me.
The label will be used as a tool in uncovering meaning; it won't be the meaning; meaning is not a label. Set aside any biases, hold back any prejudices and save judgment for later. We can always figure out how to flame me later. Complimentary rocks and pitchforks will be provided next to the comment section.
You don't just get up one day and say "Fuck it, people suck. I'm not going out anymore". It's not that you can't do that, believe me, there are people that can and do, it's just that the world won't let you. If you just quit the world immediately, without any warning, then the world freaks out; a million text messages will be sent, cops will be called to check on you, interventions will be held, walruses will be dispatched on rescue scooters. Well not that last one, but I have to keep you, the reader, on your toes. (...)
I have never emotionally imploded but I imagine it's much like a Californication episode or one of those coming of age novels where the depressed protagonist loves that girl but that girl doesn't love him so he like is all sad about unrequited love so he gets really down and does something stupid like take a lot of pills and try to ride his bike 4 and then through a series of unlikely events he meets this manic pixie dream girl let's call her Sam and she is like all kinds of adorkable and she has them anime eyes and she has this friend Garry that is a little bit Autistic and he thinks the whole world is actually just a run-on story on a collision course with a period and if they don't act exactly like the teenager writing style trope they will all die and the protagonist is taught how to live and falls in love and they.
The point is that you can't just up and quit the world. To leave the world completely one has to cut ties slowly and steadily. You have to tug, warp, twist and tear at your connections until they're stressed enough to break. It takes systematic and conscious effort to leave the world.
It takes a "special" type of person to be willing to push everything and everyone away until nothing is left. To understand how I became such a "special" person, we have to start at my beginning. This is the story of how I faded from the world.
by K-2052 | Read more:
h/t Longreads
Patti Smith: Just Kids
[ed. Just finishing this and it's very good. I wasn't much interested when it first came out (not that much of a PS fan) but I picked up a copy at the used bookstore and realized it's as much about Robert Mapplethorpe as it is Patti Smith - as the review says, a twofer. Leaves you with the sense of a uniquely complex and loving relationship.]
Apart from a certain shared apprehension of immortality — complacent in one case, but endearingly gingerly in the other — the skinny 28-year-old on the cover of Patti Smith’s seismic 1975 album, “Horses,” doesn’t look much at all like Picasso’s portrait of Gertrude Stein. But because the shutterbug was Robert Mapplethorpe, who was soon to become fairly legendary himself, that exquisite photograph of Smith on the brink of fame is as close as New York’s 1970s avant-garde ever came to a comparable twofer. The mythmaking bonus is that the latter-day duo were much more genuinely kindred spirits.
Born weeks apart in 1946, Smith and Mapplethorpe played Mutt and Jeff from their first meeting in 1967 through his death from AIDS more than 20 years later. They were lovers as well until he came out of the closet with more anguish than anyone familiar with his bold later career as gay sexuality’s answer to Mathew Brady (and Jesse Helms’s N.E.A. nemesis) is likely to find credible. Yet his Catholic upbringing had been conservative enough that he and Smith had to fake being married for his parents’ sake during their liaison.
Though Smith moved on to other partners, including the playwright Sam Shepard and the Blue Oyster Cult keyboardist-guitarist Allen Lanier, her attachment to Mapplethorpe didn’t wane. After years of mimicking her betters at poetry, she found her calling — “Three chords merged with the power of the word,” to quote the memorable slogan she came up with — at around the same time he quit mimicking his betters at bricolage to turn photographer full time. “Patti, you got famous before me,” he half-moped and half-teased when “Because the Night,” her only genuine hit single, went Top 20 in 1978. Even so, his “before” turned out to be prescient. (...)
No nostalgist about her formative years, Smith makes us feel the pinched prospects that led her to ditch New Jersey for a vagabond life in Manhattan. Her mother’s parting gift was a waitress’s uniform: “You’ll never make it as a waitress, but I’ll stake you anyway.” That prediction came true, but Smith did better — dressed as “Anna Karina in ‘Bande à Part,’ ” a uniform of another sort — clerking at Scribner’s bookstore. That job left Mapplethorpe free to doodle while she earned their keep, which she didn’t mind. “My temperament was sturdier,” she explains, something her descriptions of his moues confirm. Even when they were poor and unknown, he spent more time deciding which outfit to wear than some of us do on our taxes.
Soon they were ensconced at “a doll’s house in the Twilight Zone”: the Chelsea Hotel, home to a now fabled gallery of eccentrics and luminaries that included Harry Smith, the compiler of “The Anthology of American Folk Music” and the subject of some of her most affectionately exasperated reminiscences. For respite, there was Coney Island, where a coffee shack gives Smith one of her best time-capsule moments: “Pictures of Jesus, President Kennedy and the astronauts were taped to the wall behind the register.” That “and the astronauts” is so perfect you wouldn’t be sure whether to give her more credit for remembering it or inventing it.
Valhalla for them both was the back room at Max’s Kansas City, where Andy Warhol, Mapplethorpe’s idol, once held court. By the time they reached the sanctum, though, Warhol was in seclusion after his shooting by Valerie Solanas in 1968, leaving would-be courtiers and Factory hopefuls “auditioning for a phantom.” Smith also wasn’t as smitten as Mapplethorpe with Warhol’s sensibility: “I hated the soup and felt little for the can,” she says flatly, leaving us not only chortling at her terseness but marveling at the distinction. Yet Pop Art’s Wizard of Oz looms over “Just Kids” even in absentia, culminating in a lovely image of a Manhattan snowfall — as “white and fleeting as Warhol’s hair” — on the night of his death.
Inevitably, celebrity cameos abound. They range from Smith’s brief encounter with Salvador Dalí — “Just another day at the Chelsea,” she sighs — to her vivid sketch of the young Sam Shepard, with whom she collaborated on the play “Cowboy Mouth.” Among the most charming vignettes is her attempted pickup in an automat (“a real Tex Avery eatery”) by Allen Ginsberg, who buys the impoverished Smith a sandwich under the impression she’s an unusually striking boy. The androgynous and bony look she was to make so charismatic with Mapplethorpe’s help down the road apparently confused others as well: “You don’t shoot up and you’re not a lesbian,” one wit complains. “What do you actually do?”