Saturday, June 4, 2016
We Have No Idea What Aging Looks Like
My friend Deborah from college loves to tell this story: One of the first times we hung out, we started talking about her solo travels to Burma and assorted other spots in Southeast Asia. I was 19 years old, and like most 19-year-olds, nearly all my friends were people I met through school in some fashion, meaning that virtually all my friends were people within a two-year age range of myself (four years max, though given the dynamics of high school and even collegiate hierarchies, anything more than two years was a stretch). But as she was regaling me with her thrilling tales, I realized she couldn’t have traveled so extensively if she were my age, and it dawned on me that I was talking to someone Older.
I’d heard you weren’t supposed to ask people how old they were—what if they were Old?!—but I couldn’t help myself. I asked her how old she was, and she told me, and, according to her, I gasped, fluttered my hand to my chest, and said, “But you look so good!”
Deborah was 26.
I turn 40 this week, and this story, which was embarrassing to me the first time she told it—she had the good sense to wait to relay it to me until I was in my 30s and therefore old enough to appreciate it—has now become hilarious. It’s hilarious that I thought 26 was shockingly old, and that I thought 26 would be old enough to show signs of aging in a way that would be detrimental to one’s conventional beauty. (In fact, it seems that would be anything over 31, if we’re going by sheer numbers here—and while I’m tempted to call bullshit on that, given that people may be more satisfied with their looks the older they get, I also know that age 31 was probably when I looked objectively my best.)
We still don’t really know what aging looks like. Certainly younger people don’t, and everyone reading this is younger than someone. I used to be vaguely flattered when younger people would express surprise when I’d mention my age, until I recalled my own response to Deborah’s ancient 26. It wasn’t that I knew what 26 looked like and that she looked younger than that; it was that I had no idea what looking 26 might actually entail, just that it was older than what I’d been led to believe was the height of my own attractiveness, and that therefore the fact that she looked great at 26 meant she was an outlier and therefore warranted a cry of “But you look so good!” When a younger person tells me I “don’t look 40”—or, my favorite, that I’m “well preserved” (!)—I accept it with grace but always wonder if they’ll later recall that moment with their own embarrassment. Because I do look 40, and I’m not particularly “preserved.” They just have no idea what 40 looks like, and it’s not their fault. Until it was within eyeshot, I didn’t know myself.
What we consider older (or younger) is always in relation to ourselves. Older was once my 26-year-old friend; now that my circle of friends has loosened beyond the age constrictions of school and I have friends in their 50s, even people in their 60s don’t seem so old to me. My parents, once hopelessly old to me, I now see as—I can’t say young, but when I wanted to talk about Mad Men with them, my mother said they were saving television for “deep retirement.” Meaning not the retirement they’re in now—my father retired from paid work nearly 10 years ago, and my mother retired from homemaking as well, a feminist arrangement I adore—but a later form of retirement, when they’re too frail to travel extensively as they’re doing now. That is: When they’re Old.
There’s a particular sort of human-interest news piece that takes a person over 70 who is doing something—anything, really—and treats the fact that they are not sewn into a La-Z-Boy as a small miracle. We are supposed to find this inspiring, and I suppose it is. But it is not unique. The fact that younger folk still regard active elderly people as outliers says little about them, and everything about us. We expect old people to curl up and—well, die, I suppose (though our society is still so scared shitless of death that we spend 28 percent of our Medicare dollars in the last six months of life). So when they don’t, we’re surprised, even though we shouldn’t be. There are indeed old people who spend their days mostly watching television and complaining about their aches, but there are young people who do that too. My grandmother, who turns 90 next month, teaches line dancing lessons at her retirement home. I’m proud of her. She is not an outlier.
This idea that old people—whatever each of us considers to be old—are outliers for not fitting into what we expect of them goes double for beauty. That makes a sort of sense, given that the hallmarks of beauty are so closely associated with youth, so when a woman of a certain age still has some of those hallmarks, it is remarkable. Except: It’s not, not really, given that so much of the attention we do give to famous older women has less to do with their beauty and more with their grooming. Take the case of Helen Mirren, whom the media has long crowned as the sexy senior (which started happening 15 years ago, incidentally, back when she was the same age Julia Louis-Dreyfus is now). She’s a lovely woman, and exceptionally accomplished, but the attention paid to her sex appeal after age 50 has largely been about her refusal to style herself in a matronly fashion. (I don’t know enough about celebrity fashion to say for sure, but I’m guessing that she ushered in today’s era, when celebrities over 50 aren’t afraid to show some skin, and look great in it.) When I walk through this city, I see a lot of older women who groom themselves just as beautifully, and I’m not just talking about the Iris Apfels of the world. I’m talking my gym buddy Lynn, whose loose bun and oh-so-slightly-off-the-shoulder tees echo her life as a dancer; I’m talking my neighbor Dorothy, whose loose movie-star curls fall in her face when she talks; I’m talking real women you know, who take care of themselves, and who may or may not have the bone structure of Carmen Dell’Orefice but who look pretty damn good anyway. Part of the joke of Amy Schumer’s sublime “Last Fuckable Day” sketch was the fact that all of the women in it were perfectly good-looking. We know that women don’t shrivel up and die after 50, but we’re still not sure how to truly acknowledge it, so we continue to rely on outdated conversations about aging. 
I mean, the opening slide of that Amy Schumer sketch is: “Uncensored: Hide Your Mom.”
There’s a paradox built into acknowledging older women’s beauty: By calling attention to both their appearance and their age, we continue to treat older women who maintain an otherwise unremarkable level of grooming as exceptions. That’s not to say that we shouldn’t do so; Advanced Style, for example, is near-radical in its presentation of older women, and I’d hate for it to become just…Style. And I absolutely don’t want to say that we should start sweeping older women back under the male gaze; escaping that level of scrutiny is one of the benefits of growing older. I’m also aware of the folly of using the way we talk about celebrities as a stand-in for how we talk about age more generally—the only people whose ages we collectively examine are famous people, whose ages only come up for discussion in regard to looks if we’re all like A) Wow, that person doesn’t look that old (Cicely Tyson, 91), or B) Wow, that person looks way older than that (Ted Cruz, 45). Nobody is like, Wow, Frances McDormand is 58? And she looks it too! Still, celebrities are a useful comparison point for how our notions of age are changing, even if the ways we talk about it aren’t. Anne Bancroft was 36 when she was cast as Mrs. Robinson. A selection of women who are 36 today: Zooey Deschanel, Laura Prepon, Mindy Kaling, Rosamund Pike, Claire Danes. Kim Kardashian turns 36 in October. Can you imagine any of these people being cast as a scandalously older woman today?
by Autumn Whitefield-Madrano, New Inquiry | Read more:
Image: uncredited
Muhammad Ali (January 1942 - June 2016)
[ed. The Greatest. See also: The Outsized Life of Muhammad Ali.]
Friday, June 3, 2016
Bots are awesome! Humans? Not so much.
[ed. wtf?]
In the past few days my personal resume bot has exchanged over 24,000 messages via Facebook Messenger and SMS. It’s chatted with folks from every industry and has introduced me to people at Facebook, Microsoft, and Google — plus a half dozen small, compelling teams.
What I learned about humans and AI while sifting through those conversations is fascinating and also a little disturbing.
I’ve distilled that data into useful nuggets you should consider before jumping on the bot bandwagon.
The Backstory of #EstherBot
Earlier this week I built and launched EstherBot, a personal resume bot that can tell you about my career, interests, and values. It shot to the #2 spot on Product Hunt and my Medium post about why and how I built it spread like wildfire – racking up over 1k recommends. (Get instructions for building your own free bot here.)
EstherBot speaks to the current zeitgeist. The era of messaging has arrived along with a botpocalypse, but few people have seen examples that go beyond the personal assistant, travel butler, or shopping concierge. To some, those feel like solutions for the 1% rather than the 99%.
EstherBot is relatable and understandable. The idea is simple — the resume hasn’t really changed that much in the digital age. While you’re producing all this information about yourself in the way that you use social media, your resume doesn’t actively seek out opportunities that you might be interested in. Your resume doesn’t constantly learn and get better by observing you. Instead, you have to do all this manual work, just like you used to. Why?
There’s a ton of data that could be used to connect you to better opportunities. Data including hobbies, values, location preferences, multimedia samples of your work. On and on. A resume simply can’t hold all of that, but a bot can.
by Esther Crawford, Chatbots Magazine | Read more:
Image: uncredited
Fraying at the Edges
It began with what she saw in the bathroom mirror. On a dull morning, Geri Taylor padded into the shiny bathroom of her Manhattan apartment. She casually checked her reflection in the mirror, doing her daily inventory. Immediately, she stiffened with fright.
Huh? What?
She didn’t recognize herself.
She gazed saucer-eyed at her image, thinking: Oh, is this what I look like? No, that’s not me. Who’s that in my mirror?
This was in late 2012. She was 69, in her early months getting familiar with retirement. For some time she had experienced the sensation of clouds coming over her, mantling thought. There had been a few hiccups at her job. She had been a nurse who climbed the rungs to health care executive. Once, she was leading a staff meeting when she had no idea what she was talking about, her mind like a stalled engine that wouldn’t turn over.
“Fortunately I was the boss and I just said, ‘Enough of that; Sally, tell me what you’re up to,’” she would say of the episode.
Certain mundane tasks stumped her. She told her husband, Jim Taylor, that the blind in the bedroom was broken. He showed her she was pulling the wrong cord. Kept happening. Finally, nothing else working, he scribbled on the adjacent wall which cord was which.
Then there was the day she got off the subway at 14th Street and Seventh Avenue unable to figure out why she was there.
So, yes, she had had inklings that something was going wrong with her mind. She held tight to these thoughts. She even hid her suspicions from Mr. Taylor, who chalked up her thinning memory to the infirmities of age. “I thought she was getting like me,” he said. “I had been forgetful for 10 years.”
But to not recognize her own face! To Ms. Taylor, this was the “drop-dead moment” when she had to accept a terrible truth. She wasn’t just seeing the twitches of aging but the early fumes of the disease.
She had no further issues with mirrors, but there was no ignoring that something important had happened. She confided her fears to her husband and made an appointment with a neurologist. “Before then I thought I could fake it,” she would explain. “This convinced me I had to come clean.”
In November 2012, she saw the neurologist who was treating her migraines. He listened to her symptoms, took blood, gave her the Mini Mental State Examination, a standard cognitive test made up of a set of unremarkable questions and commands. (For instance, she was asked to count backward from 100 in intervals of seven; she had to say the phrase: “No ifs, ands or buts”; she was told to pick up a piece of paper, fold it in half and place it on the floor beside her.)
He told her three common words, said he was going to ask her them in a little bit. He emphasized this by pointing a finger at his head — remember those words. That simple. Yet when he called for them, she knew only one: beach. In her mind, she would go on to associate it with the doctor, thinking of him as Dr. Beach.
He gave a diagnosis of mild cognitive impairment, a common precursor to Alzheimer’s disease. The first label put on what she had. Even then, she understood it was the footfall of what would come. Alzheimer’s had struck her father, a paternal aunt and a cousin. She long suspected it would eventually find her.
Every 67 seconds, with monotonous cruelty, Alzheimer’s takes up residence in another American. Degenerative and incurable, it is democratic in its reach. People live with it about eight to 10 years on average, though some people last for 20 years. More than five million Americans are believed to have it, two-thirds of them women, and now Ms. Taylor would join them.
The disease, with its thundering implications, moves in worsening stages to its ungraspable end. That is the familiar face of Alzheimer’s, the withered person with the scrambled mind marooned in a nursing home, memories sealed away, aspirations for the future discontinued. But there is also the beginning, the waiting period.
That was Geri Taylor. Waiting.
Right now, she remained energized, in control of her life, the silent attack on her brain not yet in full force. But what about next week? Next month? Next year? The disease would be there then. And the year after. And forever. It has no easy parts. It nicks away at you, its progress messy and unpredictable.
“The beginning is like purgatory,” she said one day. “It’s kind of a grace period. You’re waiting for something. Something you don’t want to come. It’s like a before-hell purgatory.”
by N.R. Kleinfield, NY Times | Read more:
Image: Michael Kirby Smith
13, Right Now
She slides into the car, and even before she buckles her seat belt, her phone is alight in her hands. A 13-year-old girl after a day of eighth grade.
She says hello. Her au pair asks, “Ready to go?”
She doesn’t respond, her thumb on Instagram. A Barbara Walters meme is on the screen. She scrolls, and another meme appears. Then another meme, and she closes the app. She opens BuzzFeed. There’s a story about Florida Gov. Rick Scott, which she scrolls past to get to a story about Janet Jackson, then “28 Things You’ll Understand If You’re Both British and American.” She closes it. She opens Instagram. She opens the NBA app. She shuts the screen off. She turns it back on. She opens Spotify. Opens Fitbit. She has 7,427 steps. Opens Instagram again. Opens Snapchat. She watches a sparkly rainbow flow from her friend’s mouth. She watches a YouTube star make pouty faces at the camera. She watches a tutorial on nail art. She feels the bump of the driveway and looks up. They’re home. Twelve minutes have passed.
Katherine Pommerening’s iPhone is the place where all of her friends are always hanging out. So it’s the place where she is, too. She’s on it after it rings to wake her up in the mornings. She’s on it at school, when she can sneak it. She’s on it while her 8-year-old sister, Lila, is building crafts out of beads. She sets it down to play basketball, to skateboard, to watch PG-13 comedies and sometimes to eat dinner, but when she picks it back up, she might have 64 unread messages.
Now she’s on it in the living room of her big house in McLean, Va., while she explains what it’s like to be a 13-year-old today.
“Over 100 likes is good, for me. And comments. You just comment to make a joke or tag someone.”
The best thing is the little notification box, which means someone liked, tagged or followed her on Instagram. She has 604 followers. There are only 25 photos on her page because she deletes most of what she posts. The ones that don’t get enough likes, don’t have good enough lighting or don’t show the coolest moments in her life must be deleted.
“I decide the pictures that look good,” she says. “Ones with my friends, ones that are a really nice-looking picture.”
Somewhere, maybe at this very moment, neurologists are trying to figure out what all this screen time is doing to the still-forming brains of people Katherine’s age, members of what’s known as Generation Z. Educators are trying to teach them that not all answers are Googleable. Counselors are prying them out of Internet addictions. Parents are trying to catch up by friending their kids on Facebook. (P.S. Facebook is obsolete.) Sociologists, advertisers, stock market analysts – everyone wants to know what happens when the generation born glued to screens has to look up and interact with the world.
Right now, Katherine is still looking down.
“See this girl,” she says, “she gets so many likes on her pictures because she’s posted over nine pictures saying, ‘Like all my pictures for a tbh, comment when done.’ So everyone will like her pictures, and she’ll just give them a simple tbh.”
A tbh is a compliment. It stands for “to be heard” or “to be honest.”
Katherine tosses her long brown hair behind her shoulder and ignores her black lab, Lucy, who is barking to be let out.
“It kind of, almost, promotes you as a good person. If someone says, ‘tbh you’re nice and pretty,’ that kind of, like, validates you in the comments. Then people can look at it and say ‘Oh, she’s nice and pretty.’ ”
by Jessica Contrera, Washington Post | Read more:
She says hello. Her au pair asks, “Ready to go?”

Katherine Pommerening’s iPhone is the place where all of her friends are always hanging out. So it’s the place where she is, too. She’s on it after it rings to wake her up in the mornings. She’s on it at school, when she can sneak it. She’s on it while her 8-year-old sister, Lila, is building crafts out of beads. She sets it down to play basketball, to skateboard, to watch PG-13 comedies and sometimes to eat dinner, but when she picks it back up, she might have 64 unread messages.
Now she’s on it in the living room of her big house in McLean, Va., while she explains what it’s like to be a 13-year-old today.
“Over 100 likes is good, for me. And comments. You just comment to make a joke or tag someone.”
The best thing is the little notification box, which means someone liked, tagged or followed her on Instagram. She has 604 followers. There are only 25 photos on her page because she deletes most of what she posts. The ones that don’t get enough likes, don’t have good enough lighting or don’t show the coolest moments in her life must be deleted.
“I decide the pictures that look good,” she says. “Ones with my friends, ones that are a really nice-looking picture.”
Somewhere, maybe at this very moment, neurologists are trying to figure out what all this screen time is doing to the still-forming brains of people Katherine’s age, members of what’s known as Generation Z. Educators are trying to teach them that not all answers are Googleable. Counselors are prying them out of Internet addictions. Parents are trying to catch up by friending their kids on Facebook. (P.S. Facebook is obsolete.) Sociologists, advertisers, stock market analysts – everyone wants to know what happens when the generation born glued to screens has to look up and interact with the world.
Right now, Katherine is still looking down.
“See this girl,” she says, “she gets so many likes on her pictures because she’s posted over nine pictures saying, ‘Like all my pictures for a tbh, comment when done.’ So everyone will like her pictures, and she’ll just give them a simple tbh.”
A tbh is a compliment. It stands for “to be honest.”
Katherine tosses her long brown hair behind her shoulder and ignores her black lab, Lucy, who is barking to be let out.
“It kind of, almost, promotes you as a good person. If someone says, ‘tbh you’re nice and pretty,’ that kind of, like, validates you in the comments. Then people can look at it and say ‘Oh, she’s nice and pretty.’ ”
by Jessica Contrera, Washington Post | Read more:
Image: Victoria Milko
Thursday, June 2, 2016
The NFL’s Brewing Information War
When every important decision-maker in the NFL shuffled into a hotel conference room in Boca Raton, Florida, in March for the league’s annual meeting, the scene was initially predictable: Most of the head coaches wore golf shirts, and most everyone had found a way to make the meetings double as a family vacation. The atmosphere was festive, and the pools were full of league titans. It did not appear to be the setting for the opening salvo of a war over the future of the NFL, but that’s what it became.
The meeting shifted toward discussing whether coaches should be allowed to watch game film on the sidelines during contests, a practice never before allowed. According to multiple people who were in the room at the time, Ron Rivera, the head coach of the defending NFC champion Carolina Panthers, stood up and asked what the point of coaching was if, after preparing all week, video would be readily available on the sidelines anyway.
“Where does it end?” Rivera said this week in an interview with The Ringer. “Can you get text messages or go out there with an iPhone and figure out where to go? What are we creating? I know there are millennial players, but this is still a game created 100 years ago.”
Rivera’s stance was among the most notable scenes during a spring in which it became increasingly clear that technology’s role in football has created a divide. On one side, there are coaches who have an old-school view of the craft; on the other are the coaches, executives, and owners who anticipate the sport undergoing the same sort of data revolution that most industries experienced long ago.
The NFL could have the technological capabilities to make a sideline look like a Blade Runner reboot. But it already has a mountain of data — it’s just that the mountain is largely inaccessible. In an effort to facilitate progress, league officials in Boca Raton pitched the NFL’s latest data technology: a system that would allow franchises to view player-tracking data for all 32 teams. If implemented, the technology would enable clubs to monitor every movement on the field for the first time, yielding raw data on player performance. For example: A team concerned about a slow cornerback could actually find out how much slower he is than Antonio Brown, who, according to data shared on a 2015 Thursday Night Football broadcast, posted a maximum speed of 21.8 mph during the season.
The proposal for teams to have access to all raw player-tracking data did not make it past the league’s Competition Committee, a group of team executives, owners, and coaches, according to an NFL official. Certain coaches griped about what might happen if other teams or the public had access to this data, and the committee told team representatives that it was too much, too soon, preventing the matter from reaching the teams for a vote at the March meeting.
“In other industries it is crazy to think you are going to limit innovation just to protect the people who aren’t ready,” said Brian Kopp, president of North America for Catapult Sports, which says it has deals with 19 NFL teams to provide practice data, but not game data. “Let’s make it all equally competitive, which is: You don’t figure it out, you start losing and you lose your job.”
Rivera, who’s coached the Panthers since 2011 and serves on a subcommittee of the NFL Competition Committee, said that introducing too much technology could “take the essence” out of the sport.
“I want to get beat on the field. I don’t want to get beat because someone used a tool or technology — that is not coaching at that point,” Rivera said. “I work all week, I’m preparing and kicking your ass. All of the sudden you see a piece of live video and you figure out, ‘Oh crap, that’s what he’s doing.’ And how fair is that?”
Two seasons ago, some NFL players began wearing two tiny chips in their shoulder pads during games. The program expanded to all players this past season, when Zebra Technologies, the company that produces the chips, also outfitted every stadium with receivers that decipher all movements on the field, measuring everything from player speed to how open a pass-catcher manages to get on a given play.
If you’re suddenly worried that you’re the only one missing out on crucial pieces of football analysis, rest assured, you’re not: Aside from a few nuggets sprinkled into television broadcasts, fans don’t have access to most league-wide data. More alarmingly, teams don’t have access to any league-wide data during the season, and, according to league officials, didn’t even get their own data from the 2015 season until three weeks ago.
Though football enthusiasts often praise the NFL for being forward-thinking, it has actually lagged behind other professional leagues in the analytics revolution: the NBA runs a public player-tracking section on NBA.com, and MLB lets the public access its PITCHf/x data for research and modeling. While NFL teams have hired analysts for front-office roles and external parties have created websites aimed at tracking advanced statistics for fans and media, when league employees actually started to pitch head coaches on “Next Gen” statistics and technological advancements about four years ago, they were stunned at the reception.

by Kevin Clark, The Ringer | Read more:
Image: Getty
Facial Recognition Will Soon End Your Anonymity
Nearly 250 million video surveillance cameras have been installed throughout the world, and chances are you’ve been seen by several of them today. Most people barely notice their presence anymore — on the streets, inside stores, and even within our homes. We accept the fact that we are constantly being recorded because we expect this to have virtually no impact on our lives. But this balance may soon be upended by advancements in facial recognition technology.
Soon anybody with a high-resolution camera and the right software will be able to determine your identity. That’s because several technologies are converging to make this accessible. Recognition algorithms have become far more accurate, the devices we carry can process huge amounts of data, and there are now massive databases of faces on social media that are tied to our real names. As facial recognition enters the mainstream, it will have serious implications for your privacy.
A new app called FindFace, recently released in Russia, gives us a glimpse into what this future might look like. Made by two 20-something entrepreneurs, FindFace allows anybody to snap a photo of a passerby and discover their real name — already with 70% reliability. The app allows people to upload photos and compare faces to user profiles from the popular social network Vkontakte, returning a result in a matter of seconds. According to an interview in the Guardian, the founders claim to already have 500,000 users and have processed over 3 million searches in the two months since they’ve launched.
What’s particularly unsettling are the use cases they advocate: identifying strangers to send them dating requests, helping government security agencies to determine the identities of dissenters, and allowing retailers to bombard you with advertisements based on what you look at in stores.
While there are reasons to be skeptical of their claims, FindFace is already being deployed in questionable ways. Some users have tried to identify fellow riders on the subway, while others are using the app to reveal the real names of porn actresses against their will. Powerful facial recognition technology is now in the hands of consumers to use how they please.
It’s not just Russians who have to worry about the implications of ubiquitous facial recognition. Whenever a technology becomes cheap and powerful, it begins to show up in the unlikeliest of places.
by Tarun Wadhwa, MarketWatch | Read more:
Image: via:

A Fishy Business
The story of the world’s best fish sauce begins, like so many others, with a son who just wanted to make his mother happy. Cuong Pham and his parents came to the United States from Saigon as refugees in 1979. They settled in northern California, where Cuong eventually became an engineer, spending 16 years with Apple. His mother, however, could never find the fish sauce (nuoc mam in Vietnamese) she remembered from Vietnam. Cuong’s family owned a fish-sauce factory; his uncle would bring his mother 20-litre cans of specially selected, just-for-family nuoc mam. In America she had to settle for commercial fish sauce, often the saltier Thai variety (designed, in the words of Andrea Nguyen, a cookbook author and proprietor of the indispensable Viet World Kitchen website, “for the lusty highs and lows of Thai food, [not] the rolling hills and valleys of Viet food”). So Cuong did what any son would do: he started his own fish-sauce company.
Fish sauce – the liquid produced from anchovies salted and left to ferment in the heat for months – has long repelled most Western palates. That is starting to change. It adds a savoury depth to soups and stocks that salt alone cannot provide. If soy sauce is a single trumpet played at full blast, fish sauce is a dozen bowed double-basses; and Cuong’s fish sauce is without parallel. And while he may have made it for expats like his mother, chefs across the Pacific and in Europe have grown to love it. (...)
It forms the chief protein source for millions, and is as central to the diverse cuisines of mainland South-East Asia as olive oil is to southern Italian and Levantine food. It goes by different names: nam pla in Thailand, tuk trey in Cambodia and patis in the Philippines. A similar condiment called garum featured in ancient Roman cuisine, and indeed south-west Italy still produces small amounts of colatura di alici, an anchovy liquid similar to nuoc mam. In other parts of South-East Asia, notably Myanmar and Cambodia, people eat fermented-fish pastes, which tend to be more assertive – often used as a central ingredient rather than a flavouring. These products wring value from abundant, tiny fish too small to eat on their own; like pickling, fish sauces preserve a bountiful harvest’s nutrition.
Fish sauce can repel first-timers: it often has an intensely fishy odour, especially in the cheaper varieties, with a rubbishy edge. But its flavour rounds and mellows with cooking. Eventually it becomes addictive, essential: I’m about as Vietnamese as a bagel, and I can’t imagine my kitchen without it. You can build a marinade for nearly any grilled thing – meat, fish or vegetable – around its umami sturdiness. Greens stir-fried with garlic, nuoc mam and a squeeze of lime or splash of white wine make a happy light lunch, served over steamed rice. Mix it with lime juice, sugar, water and perhaps some sliced chillies or chopped garlic, and it becomes nuoc cham, a dip that makes everything taste better (it pairs especially, if unconventionally, well with soft, watery fruits such as pineapple and strawberry).

by Jon Fasman, 1843/The Economist | Read more:
Image: Quinn Ryan Mattingly
$4.5 Billion to Zero
Elizabeth Holmes of Theranos: From $4.5 Billion To Nothing
Image: Glen Davis/Forbes
[ed. You think you had a rough day...?!]
Wednesday, June 1, 2016
Let Them Drown
... climate change isn’t just about things getting hotter and wetter: under our current economic and political model, it’s about things getting meaner and uglier.
Edward Said was no tree-hugger. Descended from traders, artisans and professionals, he once described himself as ‘an extreme case of an urban Palestinian whose relationship to the land is basically metaphorical’. In After the Last Sky, his meditation on the photographs of Jean Mohr, he explored the most intimate aspects of Palestinian lives, from hospitality to sports to home décor. The tiniest detail – the placing of a picture frame, the defiant posture of a child – provoked a torrent of insight from Said. Yet when confronted with images of Palestinian farmers – tending their flocks, working the fields – the specificity suddenly evaporated. Which crops were being cultivated? What was the state of the soil? The availability of water? Nothing was forthcoming. ‘I continue to perceive a population of poor, suffering, occasionally colourful peasants, unchanging and collective,’ Said confessed. This perception was ‘mythic’, he acknowledged – yet it remained.
If farming was another world for Said, those who devoted their lives to matters like air and water pollution appear to have inhabited another planet. Speaking to his colleague Rob Nixon, he once described environmentalism as ‘the indulgence of spoiled tree-huggers who lack a proper cause’. But the environmental challenges of the Middle East are impossible to ignore for anyone immersed, as Said was, in its geopolitics. This is a region intensely vulnerable to heat and water stress, to sea-level rise and to desertification. A recent paper in Nature Climate Change predicts that, unless we radically lower emissions and lower them fast, large parts of the Middle East will likely ‘experience temperature levels that are intolerable to humans’ by the end of this century. And that’s about as blunt as climate scientists get. Yet environmental issues in the region still tend to be treated as afterthoughts, or luxury causes. The reason is not ignorance, or indifference. It’s just bandwidth. Climate change is a grave threat but the most frightening impacts are in the medium term. And in the short term, there are always far more pressing threats to contend with: military occupation, air assault, systemic discrimination, embargo. Nothing can compete with that – nor should it attempt to.
There are other reasons why environmentalism might have looked like a bourgeois playground to Said. The Israeli state has long coated its nation-building project in a green veneer – it was a key part of the Zionist ‘back to the land’ pioneer ethos. And in this context trees, specifically, have been among the most potent weapons of land grabbing and occupation. It’s not only the countless olive and pistachio trees that have been uprooted to make way for settlements and Israeli-only roads. It’s also the sprawling pine and eucalyptus forests that have been planted over those orchards, as well as over Palestinian villages, most notoriously by the Jewish National Fund, which, under its slogan ‘Turning the Desert Green’, boasts of having planted 250 million trees in Israel since 1901, many of them non-native to the region. In publicity materials, the JNF bills itself as just another green NGO, concerned with forest and water management, parks and recreation. It also happens to be the largest private landowner in the state of Israel, and despite a number of complicated legal challenges, it still refuses to lease or sell land to non-Jews.
I grew up in a Jewish community where every occasion – births and deaths, Mother’s Day, bar mitzvahs – was marked with the proud purchase of a JNF tree in the person’s honour. It wasn’t until adulthood that I began to understand that those feel-good faraway conifers, certificates for which papered the walls of my Montreal elementary school, were not benign – not just something to plant and later hug. In fact these trees are among the most glaring symbols of Israel’s system of official discrimination – the one that must be dismantled if peaceful co-existence is to become possible.
The JNF is an extreme and recent example of what some call ‘green colonialism’. But the phenomenon is hardly new, nor is it unique to Israel. There is a long and painful history in the Americas of beautiful pieces of wilderness being turned into conservation parks – and then that designation being used to prevent Indigenous people from accessing their ancestral territories to hunt and fish, or simply to live. It has happened again and again. A contemporary version of this phenomenon is the carbon offset. Indigenous people from Brazil to Uganda are finding that some of the most aggressive land grabbing is being done by conservation organisations. A forest is suddenly rebranded a carbon offset and is put off-limits to its traditional inhabitants. As a result, the carbon offset market has created a whole new class of ‘green’ human rights abuses, with farmers and Indigenous people being physically attacked by park rangers or private security when they try to access these lands. Said’s comment about tree-huggers should be seen in this context.
And there is more. In the last year of Said’s life, Israel’s so-called ‘separation barrier’ was going up, seizing huge swathes of the West Bank, cutting Palestinian workers off from their jobs, farmers from their fields, patients from hospitals – and brutally dividing families. There was no shortage of reasons to oppose the wall on human rights grounds. Yet at the time, some of the loudest dissenting voices among Israeli Jews were not focused on any of that. Yehudit Naot, Israel’s then environment minister, was more worried about a report informing her that ‘The separation fence … is harmful to the landscape, the flora and fauna, the ecological corridors and the drainage of the creeks.’ ‘I certainly don’t want to stop or delay the building of the fence,’ she said, but ‘I am disturbed by the environmental damage involved.’ As the Palestinian activist Omar Barghouti later observed, Naot’s ‘ministry and the National Parks Protection Authority mounted diligent rescue efforts to save an affected reserve of irises by moving it to an alternative reserve. They’ve also created tiny passages [through the wall] for animals.’
Perhaps this puts the cynicism about the green movement in context. People do tend to get cynical when their lives are treated as less important than flowers and reptiles. And yet there is so much of Said’s intellectual legacy that both illuminates and clarifies the underlying causes of the global ecological crisis, so much that points to ways we might respond that are far more inclusive than current campaign models: ways that don’t ask suffering people to shelve their concerns about war, poverty and systemic racism and first ‘save the world’ – but instead demonstrate how all these crises are interconnected, and how the solutions could be too. In short, Said may have had no time for tree-huggers, but tree-huggers must urgently make time for Said – and for a great many other anti-imperialist, postcolonial thinkers – because without that knowledge, there is no way to understand how we ended up in this dangerous place, or to grasp the transformations required to get us out. So what follows are some thoughts – by no means complete – about what we can learn from reading Said in a warming world.

by Naomi Klein, LRB | Read more:
Image: uncredited
Alone in the Wilderness
One weekend my father-in-law flew us out in his floatplane to fish for lake trout at the confluence of Upper and Lower Twin Lakes in Lake Clark National Preserve, one of the most stunningly beautiful parks in Alaska. The lakes themselves were crystal blue-green and so placid he had to drop rocks out the window on approach (rocks he kept stashed for just this kind of occasion) to create ripples and enough depth perception to land safely on the mirrored, glassy surface. Once we were down and unloaded, he left us there with our camping gear (but no food or water) and took off again for Kenai to pick up the rest of the family, who were waiting with the other supplies we'd need to camp for several days. Unfortunately, the weather soon turned bad and we ended up cold and miserable in a steady driving rain, roasting freshly caught lake trout on alder sticks over a fizzling fire. After several hours of this, we were desperately hoping to hear the low drone of a plane coming in from the distance, but no luck. That's when we went exploring for shelter in the surrounding woods (minus any firearms for bears) and found Dick's cabin. We figured if worst came to worst, at least there'd be that. Finally, late in the day, the little PA-14 came humming over the horizon and we were saved from a cold and hungry night (and having to break into Dick's cabin).
But there's more...
It seemed like we were saved, but what we didn't know was how little fuel the PA-14 had left after trying to fly to Kenai and back (and getting socked in, trying over and over to get through dense, low-lying clouds). After taking off from Twin Lakes and buzzing along for half an hour we were lulled to sleep by the steady drone of the engine... when suddenly everything went silent. We were in the middle of Lake Clark Pass (one of the most treacherous passes in Alaska) and out of fuel. I watched my father-in-law switch tanks and try to restart the engine. It coughed back to life. We flew on a little longer, weaving down a narrow canyon, searching out possible landing sites along the rocky Chilikadrotna River below (there were none). Then the engine went dead again. This time he switched the radio to an emergency frequency and waggled the wings back and forth (where the fuel tanks are located) so any remaining gas would run down into the engine. Miraculously, after several tries, the engine caught again (on fumes), and, with a few minutes left, we were able to get through the pass and out over Cook Inlet, where Kalgin Island lay several miles in the distance. The engine gave one last gasp and finally died for good, but by then we were high enough over the ocean to glide the remaining distance and land on the island's only lake. Once we touched down, my father-in-law opened the pilot's door, got out on the floats, and paddled us the rest of the way to shore. He disappeared into the woods without a word and we didn't dare ask where he was going - he wasn't saying much by then - but twenty minutes later he emerged with two five-gallon containers of av-gas that he'd stashed some years earlier for just such an occasion. We poured them into the fuel tanks and were off again, finally making it home. We even beat the rest of the family, who, still driving back from Kenai, arrived a couple hours later. What a trip.
Long story short, I never did get to see the inside of Dick Proenneke's cabin (although I wish I had). But the fishing was great!]
by markk
The Term 'Oriental' is Outdated, Not Racist
[ed. I agree with the author.]
It is now politically incorrect to use the word “Oriental,” and the admonition has the force of law: President Obama recently signed a bill prohibiting use of the term in all federal documents. Rep. Grace Meng, the New York congresswoman who sponsored the legislation, exulted that “at long last this insulting and outdated term will be gone for good.”
As an Oriental, I am bemused. Apparently Asians are supposed to feel demeaned if someone refers to us as Orientals. But good luck finding a single Asian American who has ever had the word spat at them in anger. Most Asian Americans have had racist epithets hurled at them at one time or another: Chink, slant eye, gook, Nip, zipperhead. But Oriental isn’t in the canon.
And why should it be? Literally, it means of the Orient or of the East, as opposed to of the Occident or of the West. Last I checked, geographic origin is not a slur. If it were, it would be wrong to label people from Mississippi as Southerners.
Of course I understand that some insults have benign origins. “Jap,” for example, is simply a shortening of the word Japanese, but that one stings. As 127,000 Japanese Americans were carted off to internment camps during World War II, they were repeatedly referred to by their fellow citizens and the media as Japs. It was meant as an insult and understood as such. Clearly context is important.
The problem with “Oriental,” San Francisco Chronicle columnist Jeff Yang told NPR, is that “When you think about it, the term … feels freighted with luggage. You know, it’s a term which you can’t think of without having that sort of the smell of incense and the sound of a gong kind of in your head.” In other words it makes Asians sound exotic because it was in circulation at a time when exoticizing stereotypes were prevalent.
Erika Lee, director of the Immigration History Research Center at the University of Minnesota and author of “The Making of Asian America: A History,” offered a similar explanation to NBC News: “In the U.S., the term ‘Oriental’ has been used to reinforce the idea that Asians were/are forever foreign and could never become American. These ideas helped to justify immigration exclusion, racial discrimination and violence, political disfranchisement and segregation.” Lee also claimed that continued use of the term “perpetuates inequality, disrespect, discrimination and stereotypes towards Asian Americans.”
I don’t see it that way; I see self-righteous, fragile egos eager to find offense where none is intended. A wave of anti-Oriental discrimination is not sweeping the country. Besides, the term has been steadily falling out of circulation since the 1950s, and it’s mainly used today by older Asians and the proprietors of hundreds if not thousands of restaurants, hotels, shops and organizations with Oriental in their name. The well-intentioned meddlers will create trouble for exactly the population they want to defend.
My profession, Oriental medicine, is among those on the receiving end of the identity-politics outbreak. A funny thing I noticed is that my Caucasian (dare I say Occidental?) colleagues, not my Asian colleagues, are most eager to remove Oriental from public discourse. I suppose they’re busy shouldering their burden of guilt. Margaret Cho said it best: “White people like to tell Asians how to feel about race because they’re too scared to tell black people.”
by Jayne Tsuchiyama, LA Times | Read more:
Image: Jeff Christensen /AP