Friday, July 15, 2011
Friday Book Club - The Liars' Club
Imagine you are a child of 7 and this is your sharpest memory: "Our family doctor knelt before me where I sat on a mattress on the bare floor. . . . He was pulling at the hem of my favorite nightgown. . . . 'Show me the marks,' he said. 'Come on, now. I won't hurt you.' "
Thus opens "The Liars' Club," Mary Karr's haunting memoir of growing up in East Texas in the early 1960's, virtually motherless, and fiercely seeking to understand her parents, their lives and their relationship to her sister and herself.
Daddy drank every day, but "he never missed a day of work in 42 years at the plant; never cried -- on the morning after -- that he felt some ax wedged in his forehead; never drew his belt from his pant loops to strap on us or got weepy over cowboy songs the way some guys down at the Legion did." Mother was a different story. "Looking back from this distance, I can also see Mother trapped in some way, stranded in her own silence. How small she seems in her silk dress, drinking stale coffee."
A reader could conclude that no one speaks in this memoir except the narrator, and that would be almost true. But even mute, this mother is the story; give or take a few exceptions, she's the whole story. Charlie Marie Moore Karr, a.k.a. Mother, is a huge enigma who by her very presence, her silent, raging sadness and fierce passions dominates the family. She is an enigma not only to her daughters and husband, but to the set of children whom she abandoned years before giving birth to Mary and her older sister, Lecia, and whose existence she has held as a corrosive secret. And she has remained an enigma to everyone, including the six men she has married and divorced, even Daddy, J. P. Karr, whom she married twice.
The Liars' Club turns out to be just a place where the men meet on their days off to play dominoes and drink in the back room of the bait shop. Mary Karr's father is mainly just a regular guy. It is her mother who takes on enormous, suffocating dimension.
As Mother rarely speaks, it is left to the imagination of the daughters to attempt to translate her silences. While Daddy, who works in the oilfields of Leechfield, where Agent Orange is manufactured, has a sweet steady Texas grit, Mother has what her daughter calls East Coast longings. She is too refined for Texas, and is "adjudged more or less permanently Nervous." Born in West Texas, she had gone to New York, where she spent her youth and first marriages and went to the opera and to museums. Back in East Texas, she reads Camus and Sartre and tries to throw herself out of speeding cars while drunk.
In Mary's eyes, the most admirable thing about Charlie is that she's a painter. Daddy and his card-playing buddies in the Liars' Club build her a studio in the back of their house, and the first thing she paints on her visits home from caring for her own mother is "a portrait of Grandma . . . from a Polaroid taken just before Grandma lost the leg."
Shortly before the major catastrophe that's about to happen to these girls, Ms. Karr notes, "I see Mother's face wearing that thousand-yard stare. . . . The back door she's staring through opens on a wet black night." Charlie is immeasurably, palpably sad. Her art, in the end, is not enough to hold her -- nor is any art. She just reads Tolstoy, plays old Bessie Smith records and cries.
Read more:
Everyone's So Smart Now!
by Catherine Rampell
We’ve written before about some of the work of Stuart Rojstaczer and Christopher Healy, grade inflation chroniclers extraordinaire. They have put together a new, comprehensive study of college grading over the decades, and let me tell you, it is a doozy.
The researchers collected historical data on letter grades awarded by more than 200 four-year colleges and universities. Their analysis (published in the Teachers College Record) confirms that the share of A grades awarded has skyrocketed over the years. Take a look at the red line in the chart below, which shows the share of all grades given that are A's:
Stuart Rojstaczer and Christopher Healy. Note: 1940 and 1950 (nonconnected data points in the figure) represent averages from 1935 to 1944 and 1945 to 1954, respectively. Data from 1960 onward represent annual averages in their database, smoothed with a three-year centered moving average.
Most recently, about 43 percent of all letter grades given were A’s, an increase of 28 percentage points since 1960 and 12 percentage points since 1988. The distribution of B’s has stayed relatively constant; the growing share of A’s instead comes at the expense of a shrinking share of C’s, D’s and F’s. In fact, only about 10 percent of grades awarded are D’s and F’s.
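To make the arithmetic concrete, here is a minimal Python sketch (mine, not the researchers') of the A-shares implied by those percentage-point changes, plus the three-year centered moving average described in the chart note. The yearly values at the end are invented for illustration, not the study's data.

```python
# Implied share of A's in earlier years, working back from the 43% figure:
share_a_now = 43.0
share_a_1960 = share_a_now - 28.0   # about 15% of grades were A's circa 1960
share_a_1988 = share_a_now - 12.0   # about 31% circa 1988
print(share_a_1960, share_a_1988)   # -> 15.0 31.0

def centered_moving_average(values, window=3):
    """Smooth a series with a centered moving average; endpoints are dropped."""
    half = window // 2
    return [
        sum(values[i - half : i + half + 1]) / window
        for i in range(half, len(values) - half)
    ]

# Hypothetical annual A-shares for five years, smoothed as in the chart note:
annual = [40.1, 41.0, 42.2, 42.7, 43.3]
print(centered_moving_average(annual))  # -> [41.1, 41.97, 42.73] (rounded)
```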
As we have written before, private colleges and universities are by far the biggest offenders on grade inflation, even when you compare private schools to equally selective public schools. Here’s another chart showing the grading curves for public versus private schools in the years 1960, 1980 and 2007:
Stuart Rojstaczer and Christopher Healy. Note: 1960 and 1980 data represent averages from 1959–1961 and 1979–1981, respectively.

As you can see, public and private school grading curves started out relatively similar and gradually pulled apart. Both types of institutions eased their grading over time, but private schools eased theirs far more.
Read more:
P.S. Thanks to the Hairpin for the great title.
Thanks for Sharing
by Felix Salmon
Is there a company in the world which isn’t trying to “harness and leverage the power of social media to amplify our brand” or somesuch? I’m a pretty small fish in the Twitter pond, and I get asked on a very regular basis to talk to various marketing types about how they should be using Twitter. A smart organization with a big Twitter presence, then, will naturally start trying to leverage its ability to leverage Twitter by putting together sophisticated presentations full of “insights to help marketers align their content-sharing strategies” and the like. Which is exactly what the New York Times has just done.
The slideshow can be found here, and it’s worth downloading just to see how many photos the NYT art department could find of good-looking young people looking happy in minimalist houses. But it actually includes some interesting insights, too, which were spelled out at a conference yesterday by Brian Brett of the NYT Customer Research Group.
The survey claims to be the first of its kind on why people share content, which is a very good question. A large part of how people enjoy themselves online these days is by creating and sharing content, which is both exciting and a little bit scary for anybody in a media organization. And the NYT methodology was fun, too: aside from the standard surveys and interviews, they asked a bunch of people who don’t normally share much to spend a week sharing a lot; and they also asked a lot of heavy sharers to spend a week sharing nothing. (“It was like quitting smoking,” said one, “only harder”.)
The first striking insight is about the degree to which the act of sharing deepens understanding. It’s not at all surprising to learn that 85% of people say that they use other people’s responses to help them understand and process information — in fact 100% of people do that, and they’ve been doing it for centuries. We always react to news and information in large part by looking at how other people react to it.
But more interesting is the fact that 73% of people say that the simple act of sharing a piece of information with others makes them likely to process that information more deeply and thoughtfully. It’s like writing things down to remember them: the more you engage with something, the more important and salient it becomes to you.
Don't Be Evil
By Evgeny Morozov
July 13, 2011
In the Plex: How Google Thinks, Works, and Shapes Our Lives
By Steven Levy
The Googlization of Everything (And Why We Should Worry)
By Siva Vaidhyanathan
I.
For cyber-optimists and cyber-pessimists alike, the advent of Google marks off two very distinct periods in Internet history. The optimists remember the age before Google as chaotic, inefficient, and disorganized. Most search engines at the time had poor ethics (some made money by misrepresenting ads as search results) and terrible algorithms (some could not even find their parent companies online). All of that changed when two Stanford graduate students invented an ingenious way to rank Web pages based on how many other pages link to them. Other innovations spurred by Google—especially its novel platform for selling highly targeted ads—have created a new “ecosystem” (the optimists’ favorite buzzword) for producing and disseminating information. Thanks to Google, publishers of all stripes—from novice bloggers in New Delhi to media mandarins in New York—could cash in on their online popularity.
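That ingenious ranking method is, of course, PageRank. As a rough illustration of the idea (a toy sketch, nothing like Google's production system), a page's score can be computed by repeatedly redistributing scores along links until they settle:

```python
# Toy PageRank sketch: each page's score is repeatedly recomputed from the
# scores of the pages linking to it. Illustration only, not Google's code.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:              # dangling page: spread score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Three-page example: both A and B link to C, so C ends up ranked highest.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))
```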
Cyber-pessimists see things quite differently. They wax nostalgic for the early days of the Web when discovery was random, and even fun. They complain that Google has destroyed the joy of serendipitous Web surfing, while its much-celebrated ecosystem is just a toxic wasteland of info-junk. Worse, it’s being constantly polluted by a contingent of “content farms” that produce trivial tidbits of information in order to receive a hefty advertising paycheck from the Googleplex. The skeptics charge that the company treats information as a commodity, trivializing the written word and seeking to turn access to knowledge into a dubious profit-center. Worst of all, Google’s sprawling technology may have created a digital panopticon, making privacy obsolete.
Both camps like to stress that Google is a unique enterprise that stands apart from the rest of Silicon Valley. The optimists do this to convince the public that the company’s motives are benign. If only we could bring ourselves to trust Google, their logic goes, its bright young engineers would deliver us the revolutionary services that we could never expect from our governments. The pessimists make a more intriguing case: for them, the company is so new, sly, and fluid, and the threats that it poses to society are so invisible, insidious, and monumental, that regulators may not yet have the proper analytical models to understand its true market and cultural power. That our anachronistic laws may be incapable of treating such a complex entity should not deter us from thwarting its ambitions.
These are not mutually exclusive positions. History is rife with examples of how benign and humanistic ideals can yield rather insidious outcomes—especially when backed by unchecked power and messianic rhetoric. The real question, then, is whether there is anything truly exceptional about Google’s principles, goals, and methods that would help it avoid this fate.
IS GOOGLE’S EXCEPTIONALISM genuine? On the surface, the answer seems self-evident. The company’s collegial working environment, its idealistic belief that corporations can make money without dirtying their hands, its quixotic quest to organize all of the world’s information, its founders’ contempt for marketing and conventional advertising—everything about the company screams, “We are special!” What normal company warns investors—on the very day of its initial public offering!—that it is willing to “forgo some short-term gains” in order to do “good things for the world”?
As Google’s ambitions multiply, however, its exceptionalism can no longer be taken for granted. Two new books shed light on this issue. Steven Levy had unrivaled access to Google’s executives, and In the Plex is a colorful journalistic account of the company’s history. Levy’s basic premise is that Google is both special and crucial, while the battle for its future is also a battle for the future of the Internet. As Levy puts it, “To understand this pioneering company and its people is to grasp our technological destiny.” What the German poet Friedrich Hebbel said of nineteenth-century Austria—that it is “a little world where the big one holds its tryouts”—also applies to Google. Siva Vaidhyanathan’s book is a far more intellectually ambitious project that seeks to document the company’s ecological footprint on the public sphere. Unlike Levy, Vaidhyanathan seeks to place Google’s meteoric rise and exceptionalism in the proper historical, cultural, and regulatory contexts, and suggests public alternatives to some of Google’s ambitious projects.
Even though both writers share the initial premise that, to quote Vaidhyanathan, Google is “nothing like anything we have seen before,” they provide different explanations of Google’s uniqueness. Levy opts for a “great man of history” approach and emphasizes the idealism and the quirkiness of its two founders. The obvious limitation of Levy’s method is that he pays very little attention to the broader intellectual context—the ongoing scholarly debates about the best approaches to information retrieval and the utility (and feasibility) of artificial intelligence—that must have shaped Google’s founders far more than the Montessori schooling system that so excites him.
Vaidhyanathan, while arguing that Google is “such a new phenomenon that old metaphors and precedents don’t fit the challenges the company presents to competitors and users,” posits that its power is mostly a function of recent developments in the information industry as well as of various market and public failures that occurred in the last few decades. Quoting the Marxist theorist David Harvey, Vaidhyanathan argues that the fall of communism in Eastern Europe and the resulting euphoria over “the end of history” and the triumph of neoliberalism has made the “notion of gentle, creative state involvement to guide processes toward the public good ... impossible to imagine, let alone propose.” Moreover, the growing penetration of market solutions into sectors that were traditionally managed by public institutions—from fighting wars to managing prisons and from schooling to health care—has made Google’s forays into digitizing books appear quite normal, set against the dismal state of public libraries and the continued sell-out of higher education to the highest corporate bidder. Thus Vaidhyanathan arrives at a rather odd and untenable conclusion: that Google is indeed exceptional—but its exceptionalism has little to do with Google.
Google’s two founders appear to firmly believe in their own exceptionalism. They are bold enough to think that the laws of sociology and organizational theory—for example, that most institutions, no matter how creative, are likely to end up in the “iron cage” of highly rationalized bureaucracy—do not apply to Google. This belief runs so deep that for a while they tried to run the company without middle managers—with disastrous results. Google’s embarrassing bouts of corporate autism—those increasingly frequent moments when the company is revealed to be out of touch with the outside world—stem precisely from this odd refusal to acknowledge its own normality. Time and again, its engineers fail to anticipate the loud public outcry over the privacy flaws in its products, not because they lack the technical knowledge to patch the related problems but because they have a hard time imagining an outside world where Google is seen as just another greedy corporation that might have incentives to behave unethically.
Read more:
Smile, You're On Everyone's Camera
by Farhad Manjoo

According to the Wall Street Journal, police departments across the nation will soon adopt handheld facial-recognition systems that will let them identify people with a snapshot. These new capabilities are made possible by BI2 Technologies, a Massachusetts company that has developed a small device that attaches to officers' iPhones. The police departments who spoke to the Journal said they plan to use the device only when officers suspect criminal activity and have no other way to identify a person—for instance, when they stop a driver who isn't carrying her license. Law enforcement officials also seemed wary about civil liberties concerns. Is snapping someone's photo from five feet away considered a search? Courts haven't decided the issue, but sheriffs who spoke to the paper say they plan to exercise caution.
Don't believe it. Soon, face recognition will be ubiquitous. While the police may promise to tread lightly, the technology is likely to become so good, so quickly that officers will find themselves reaching for their cameras in all kinds of situations. The police will still likely use traditional ID technologies like fingerprinting—or even iris scanning—as these are generally more accurate than face-scanning, but face-scanning has an obvious advantage over fingerprints: It works from far away. Bunch of guys loitering on the corner? Scantily clad woman hanging around that run-down motel? Two dudes who look like they're smoking a funny-looking cigarette? Why not snap them all just to make sure they're on the up-and-up?
Sure, this isn't a new worry. Early in 2001, police scanned the faces of people going to the Super Bowl, and officials rolled out the technology at Logan Airport in Boston after 9/11. Those efforts raised a stink, and the authorities decided to pull back. But society has changed profoundly in the last decade, and face recognition is now set to go mainstream. What's more, the police may be the least of your worries. In the coming years—if not months—we'll see a slew of apps that allow your friends and neighbors to snap your face and get your name and other information you've put online. This isn't a theoretical worry; the technology exists, now, to do this sort of thing crudely, and the only thing stopping companies from deploying it widely is a fear of public outcry. That fear won't last long. Face recognition for everyone is coming. Get used to it.
Read more:
Thursday, July 14, 2011
Why Not the Worst?
by Gene Weingarten
We promised to find the armpit of America. Turns out it's only about five inches from the heart.
My little puddle jumper begins its descent into Elko, a charmless city of 20,000 in the northern Nevada desert. Eighteen seats, all filled. This is not because Elko is a hot tourist attraction; it is because almost everyone else on board belongs to a mariachi band. These guys have identical shiny blue suits and shiny blue shirts and shiny blue ties and shiny blue-black hair, like Rex Morgan in the comics, and they seem embarrassed to have accepted a gig in a place as tacky as Elko. Compared with my final destination, Elko is Florence during the Italian Renaissance.
When I tell the Elko rental car agent where I am headed, she laughs. Elkonians, who proudly sponsor a yearly civic event called the "Man-Mule Race," consider their neighbor 70 miles west to be an absolute clodhoppy riot.
"Don't sneeze," snorts the rental car woman, "or you'll miss it."
Yeah, I know. I went to Battle Mountain five weeks before, to see if it was dreadful enough to be anointed, officially, "The Armpit of America." I was exorbitantly convinced.
That first visit was in late August. This second one is in early October. In the interim, Everything Changed. With the nation united in mourning and at war, with the Stars and Stripes aflutter in places large and small, slick and hicky, the idea of poking fun at any one part of us became a great deal less funny. The zeitgeist had shifted. Snide was out.
I had to go back, to rethink things.
The road to Battle Mountain is flatter than any cliche -- even pancakes have a certain doughy topology. On this route, there is nothing. No curves. No trees. It is desert, but it is lacking any desert-type beauty. No cacti. No tumbleweeds. None of those spooky cow skulls. The only flora consists of nondescript scrub that resembles acre upon acre of toilet brushes buried to the hilt.
You know you have arrived at Battle Mountain because the town has marked its identity on a nearby hill in enormous letters fashioned from whitewashed rock.
I have returned to this place to find in it not America's armpit, but America's heart. I am here to mine the good in it, to tell the world that Battle Mountain doesn't stink. That is my new challenge.
I hang a right off the highway at the base of the hill, which proudly proclaims, in giant letters:
BM
Man. This is not going to be easy.
Take a small town, remove any trace of history, character, or charm. Allow nothing with any redeeming qualities within city limits -- this includes food, motel beds, service personnel. Then place this pathetic assemblage of ghastly buildings and nasty people on a freeway in the midst of a harsh, uninviting wilderness, far enough from the nearest city to be inconvenient, but not so far for it to develop a character of its own. You now have created Battle Mountain, Nevada.
Read more:
image credit:
Hands-on with Spotify
by Donald Bell

You know something is good when it feels illegal. Such is the case with Spotify, the on-demand music-streaming service that seems too good to be true--or certainly, too good to be free. Yet, here it is, the "celestial jukebox" we've been dreaming of since the days of illegally gorging on the original Napster. It's called Spotify, it's finally available in the U.S., and music fans have reason to cheer.
What it does
What Spotify does is so simple and seemingly harmless, it's actually a sad comment on humanity that it counts as a groundbreaking product. As a first-time user, you install the free Spotify Mac/PC application, open it up, and watch as it automatically imports your music collection and playlists from iTunes and other music software and presents you with a landing page filled with new releases, top lists, and music shared by your friends. The big trick, though, is a little search box at the top of the screen that lets you search for any reasonably popular artist, song, or album in existence and stream it immediately. You can't get The Beatles, but we had no problem finding greats like The Rolling Stones and David Bowie, as well as obscure indies such as The Ghastly Ones or Four Tet.
Put simply, you tell your computer what you want to hear, and it plays it for you...for free, and without limitations for up to six months. It doesn't play something similar to the song you want (like Pandora), or a 30- to 60-second clip of the song you want (like iTunes)--it plays you the whole song or album, just as if it were in your personal music collection.
Of course, there are a few other bells and whistles that make Spotify its own special thing. Facebook and Twitter integration allows you to easily share music discoveries with friends. Artist pages encourage discovery with bio pages and links out to similar artists and top hits of the decade to add context. Without any friction preventing you from jumping from one great song to the next, Spotify also provides a play queue off to the side, allowing you to stash your discoveries without interrupting the currently playing song.
And let's not forget the small but not insignificant matter of style. Spotify's polished, iTunes-like interface is as inviting to music fans as a well-stocked record bin. Each portion of the bento boxlike layout can be resized, and playback, volume, and track scrubber controls are placed neatly across the bottom. Browserlike back and forward buttons located to the left of the search box allow you to dig your way back out of the rabbit hole of music discovery.
The catch
Spotify's music service is uniquely generous, but it's not without limitations. Using the free version of the service, full songs can be streamed on-demand an unlimited amount for up to six months (with the occasional audio ad popping into rotation, similar to Pandora). After that time, free users can only play a given track a maximum of five times per month and are also subject to a cap of 10 hours of streaming per month. If you can cough up $5 per month, those restrictions (and ads) disappear, but you're still limited to only listening from your computer. At $10 per month, you can use Spotify on mobile devices (including iOS, Android, and Windows Phone 7), and even cache your favorite music and playlists for offline listening.
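To spell out those tier rules, here is a small, entirely hypothetical Python sketch of the limits as the review describes them; the class and function names are invented for illustration and have nothing to do with Spotify's actual software.

```python
# Hypothetical model of the free-tier rules described above: five plays per
# track per month and ten streaming hours per month. Invented names; this is
# not Spotify's real API.

FREE_PLAYS_PER_TRACK = 5
FREE_HOURS_PER_MONTH = 10

class FreeTier:
    def __init__(self):
        self.plays = {}              # track id -> plays so far this month
        self.seconds_streamed = 0

    def can_play(self, track_id, track_seconds):
        under_track_cap = self.plays.get(track_id, 0) < FREE_PLAYS_PER_TRACK
        under_hours_cap = (self.seconds_streamed + track_seconds
                           <= FREE_HOURS_PER_MONTH * 3600)
        return under_track_cap and under_hours_cap

    def play(self, track_id, track_seconds):
        if not self.can_play(track_id, track_seconds):
            raise RuntimeError("Upgrade to the $5 or $10 tier to keep listening")
        self.plays[track_id] = self.plays.get(track_id, 0) + 1
        self.seconds_streamed += track_seconds

listener = FreeTier()
listener.play("gimme-shelter", 271)  # fine: first play, well under ten hours
```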
The Make-Believer
by Katrina Onstad
Miranda July stood in her living room in the Silver Lake neighborhood of Los Angeles, apologizing for the sunflowers. It really was a copious amount of sunflowers.
They sprouted from Mason jars and vases, punctuating the austere, Shaker-like furniture in the sunny home that July, who is 37, shares with her husband, the filmmaker Mike Mills, who’s 45. Noticing me noticing the sunflowers, she interjected: “We just had a party. We don’t usually have sunflowers everywhere.”
In person, July was very still, with ringlets of curly hair falling over her wide blue eyes like a protective visor, and she seemed preemptively aware of the “precious” label that is often attached both to her and to her work. At a different point in our time together, I followed her into a hotel room in San Francisco, where Mills had left her a knitted octopus wearing a scarf and hat on the couch. She laughed when she saw it but also appeared a bit mortified: “Oh, God,” she said. “It’s kind of a joke. . . . It’s not. . . . Really, this isn’t us at all.”
At their house, Mills emerged from his office; in contrast to July’s measured composure, Mills seemed in constant motion, often running his hands through his Beethoven hair. Both he and July have directed new films being released this summer. His film, “Beginners,” is loosely based on the true story of his father’s coming out at age 75. July’s film, “The Future,” is her second feature as a director, and it’s a funny, sad portrait of a couple at a crossroads. Sophie, played by July, works at a children’s dance school, and Jason, played by Hamish Linklater, provides tech support by telephone from their sofa. The couple is weeks away from adopting Paw-Paw, an injured cat and a symbol of impending adulthood who is also the film’s narrator. A talking cat is exactly the kind of detail that might endear people who are endeared by Miranda July and infuriate people who are infuriated by her. There are plenty of both.
“You’ve met us at a weird time,” Mills said. “We’re usually just two workaholics in our separate corners.” July and Mills first crossed paths in 2005, when July’s debut feature, “Me and You and Everyone We Know,” made its premiere at Sundance at the same time as Mills’s film “Thumbsucker.” They met at a party — “She wore a yellow dress,” he recalls — and he watched her do a Q. and A. the next day. “She was so strong and declarative. I fell in love instantly.” They married in the summer of 2009 at Mills’s house in the Nevada hills.
In one sense, July has been enjoying the Platonic ideal of creative success in the age of the hyphenate artist. She publishes short stories in The New Yorker. The seven-year Web project, “Learning to Love You More,” which she produced with Harrell Fletcher — in which more than 8,000 people submitted material in response to online assignments like “Make a protest sign and protest” and “Take a picture of your parents kissing” — was recently acquired by the San Francisco Museum of Modern Art. “Me and You and Everyone We Know” won the Camera d’Or at Cannes and was named by Roger Ebert as one of the best films of the 2000s. She inspires a devotion among her fans that is positively swoony: “I love Miranda July, and everything she does is so subtle and sweet and bizarre and necessary,” is a fairly typical sentiment. July is preoccupied with intimacy — she habitually uses the words “you” and “we” in her titles — and this demands, and inspires, an intense engagement from her followers. After a screening of “The Future” at the San Francisco Film Festival, a small crowd surrounded July, pinning her against the back wall of the movie theater, wanting to tell her, with palpable urgency, how much her work mattered to them. Her office has an entire room filled top to bottom with boxes of letters and objects from fans around the world. One man printed every e-mail he ever wrote and sent them all to July, because only she would understand.
Yet despite this (or perhaps because of it) she has also become the unwilling exemplar of an aggravating boho archetype: the dreamy, young hipster whose days are filled with coffee, curios and disposable enchantments. “Yes, in some ways Miranda July is living my dream and life, and yes, maybe I’m a little jealous,” wrote one Brooklyn-based artist on her blog. “I loathe her. It feels personal.” To her detractors (“haters” doesn’t seem like too strong a word) July has come to personify everything infuriating about the Etsy-shopping, Wes Anderson-quoting, McSweeney’s-reading, coastal-living category of upscale urban bohemia that flourished in the aughts. Her very existence is enough to inspire, for example, an I Hate Miranda July blog, which purports to detest her “insufferable precious nonsense.” Or there is the online commenter who roots for July to be exiled to Darfur. Or the blogger who yearns to beat her with a shoe.
Read more:
Always On
Millions of people around the world are carrying smartphones and computer tablets that keep them constantly connected to the Internet. There are now more than 400,000 apps in Apple's online store — and 250,000 in Google's Android market — that allow their users to do hundreds of everyday tasks, all from the comfort of their handheld devices.
Constantly having access to these hundreds of thousands of applications has far-reaching implications for our society, says technology writer Brian X. Chen.
"Millions of us are carrying these devices that have a constant Internet connection and also access to hundreds of thousands of applications of very smart interfaces tailored to suit our everyday needs," he tells Fresh Air's Dave Davies. "If you think of that phenomenon [of being constantly connected], everything has to change: the way we do policing, the way we do education, [and] the way that we might treat medicine."
In Always On: How the iPhone Unlocked the Anything-Anytime-Anywhere Future — and Locked Us In, Chen examines what it means to have an uninterrupted connection to the Internet — and how smartphone-based applications will revolutionize the way we conduct our lives on- and offline.
Some of Chen's personal favorite applications, he says, are DropBox, which allows users to transfer files between computers and mobile devices; Uber, which helps hail black cars in San Francisco; and Pocket God, a game that allows users to manipulate a virtual island full of people.
Educational Applications
Some apps, he says, are not just fun — they may alter the way we relate and learn from one another.
For example, Abilene Christian University in Texas now gives every incoming freshman an iPhone and integrates the device into the curriculum.
"Instead of lecturing students and saying, 'Hey, open your textbook and go to page 96,' the teacher is acting as a guide and saying 'OK, so here's the topic we're going to discuss today. Take out your iPhone and go search on the web or search Wikipedia and let's have a conversation about where we want to take this discussion," Chen says.
He explains that students at Abilene are being taught the importance of discerning good data from bad data — and not just to blindly accept the information that would have been presented in a textbook.
"Abilene Christian is thinking forward and teaching people how to do ... a very important skill, because there's so much bad information out there on the Web," he says. "This is something they're experimenting with, and it's been successful, because students who are part of the iPhone program are actually getting better grades than the students who are taking comparable classes without iPhones."
Medical Applications
Some applications will revolutionize the ways doctors practice medicine and the ways patients interact with their physicians, Chen says.
Researchers at the University of Washington have taken initial steps to create a digital contact lens that would monitor vital signs in real time and provide instantaneous feedback to physicians through wireless radio connections.
"What's interesting about the eye is that the eye is like the little door into the body," Chen says. "And you can collect information about, say, cholesterol or glucose levels or blood pressure and transfer this information to a smartphone."
Currently, the researchers are testing their prototype contact lens on rabbits, but they hope to eventually integrate their designs into everyday eyewear.
"I think eventually we're going to see more of these technologies embedded into our bodies," he says. "Not just our eyes but maybe our hands and our feet, just listening to our vital signs so that we can get real-time feedback and keep good track of our health."
Law Applications
Police officers and lawyers also will benefit from having mobile apps always at the ready. "Some police officers are testing an application called Morris," Chen explains. "[It] allows officers to scan fingerprints of suspects and also scan their eyeballs and cross-reference that information with the database they have back at the police station."
Morris shaves hours off of an initial booking because police no longer have to drive suspects to a station for fingerprint analysis.
"It could help them make a lot more arrests that are accurate in the future," Chen says. "There are only a few stations that are testing this application [because] it costs $30,000, so it's unlikely we're going to see it anytime soon in every police officer's hands, but it's something we're working on to reduce costs and potentially streamline law enforcement a lot."
Wednesday, July 13, 2011
The Midas Touch
by Lewis H. Lapham
Jesus answered, “It is written, ‘Man shall not live
on bread alone, but on every word that comes from the mouth of God.’”
—The Gospel According to Matthew
It is a hard matter, my fellow citizens, to argue with the belly, since it has no ears.
—Cato the Elder
In both the periodical and tabloid press these days, the discussion tends to dwell on the bread alone—its scarcity or abundance, its price, provenance, authenticity, presentation, calorie count, social status, political agenda, and carbon footprint. The celebrity guest on camera with Rachael Ray or an Iron Chef, the missing ingredient in the recipes for five-star environmental collapse. Either way, sous vide or sans tout, the preoccupation with food is front-page news, and in line with the current set of talking points, this issue of Lapham’s Quarterly offers various proofs of the proposition that the belly has no ears.
No ears but many friends and admirers, who spread out on the following pages a cornucopia of concerns about which I knew little or nothing before setting the table of contents. My ignorance I attribute to a coming of age in the America of the late 1940s, its cows grazing on grass, the citizenry fed by farmers growing unpatented crops. Accustomed to the restrictions imposed on the country’s appetite by the Second World War’s ration books, and raised in a Protestant household that didn’t give much thought to fine dining (one ate to live, one didn’t live to eat), I acquired a laissez-faire attitude toward food that I learn from Michael Pollan resembles that of the Australian koala. The koala contents itself with the eating of eucalyptus leaves, choosing to ignore whatever else shows up in or around its tree. Similarly, the few primitive tastes met with before my tenth birthday—peanut butter and jelly, creamed chicken and rice, the Fig Newton—have remained securely in place for the last sixty-six years, faith-based and conservative, apt to be viewed with suspicion at trendsetting New York restaurants, in one of which last winter my asking about the chance of seeing a baked or mashed potato prompted the waiter to remove the menu from my hand, gently but firmly retrieving the pearl from a swine. The judgment was served à la haute bourgeoisie, with a sprig of disdain and a drizzle of disgust. Thirty years ago I would have been surprised, but thirty years ago trendsetting restaurants hadn’t yet become art galleries, obesity wasn’t a crime, and at the airports there weren’t any Homeland Security agents confiscating Coca-Cola.
Times change, and with them what, where, and how people eat. In fifteenth-century London a man could be hanged for eating meat on Friday. An ancient Roman was expected to wear a wreath to a banquet. The potato in sixteenth-century Europe was believed to cause leprosy and syphilis. As of two years ago, 19 percent of America’s meals were being eaten in cars.
Prior to the twentieth century, the changes were relatively slow in coming. The text and illustration in this issue of the Quarterly reach across a span of four thousand years, during most of which time the global economy is agrarian. Man is the tenant of nature, food the measure of both his wealth and well-being. The earliest metal currencies (the shekel, the talent, the mina) represent weights and units of grain. Allowing for cultural difference and regional availability, the human family sits down to meals made of what it finds in the forest or grows in the field, the tables set from one generation to the next in accordance with the changing of the seasons and the benevolence of Ashnan or Ceres. To Achilles and Priam circa the twelfth century BC, Homer brings the meat “pierced…with spits,” the bread “in ample wicker baskets” with the same meaning and intent that Alexandre Dumas in nineteenth-century France imparts to the ripe fruit and the rare fish presented by the Count of Monte Cristo to Monsieur Danglars.
It is the sharing of the spoils of the hunt and the harvest, a public as opposed to a private good, that sustains the existence of the earliest human societies, sows the seeds of moral value, social contract, distributive justice, and holy ground. The communal sacrifice entitles all present to a portion of the bullock or the goat and therefore to a presence in the assembly and a finger in the pie. Table manners are the teaching of both politics and philosophy, for Roman emperors as for the Renaissance scholar Desiderius Erasmus and the Confucian Chinese.
The contract between man and nature remains in force for as long as it is understood which one is the tenant and which one the landlord. Over the course of millennia men discover numerous ways of upgrading their lot—cooking with fire, domesticating animals and plants, bringing the tomato from Mexico to Spain, pepper from Sumatra to Salem, constructing the chopstick, the seine net, and the salad fork—but the world’s population stays more or less in balance with the world’s agriculture because the landlord is careful about matching supply and demand. The sum of the world’s economic enterprise is how much or how little anybody gets to eat, the number of those present above and below the salt accounting for the margin of difference between a bull and a bear market. For thousands of years the four horsemen of the apocalypse, war and famine prominent among them, attend to the culling of the human herd. Europe in the fourteenth century doesn’t produce enough food to serve the increasingly large crowd of expectant guests. The Black Death reduces by a third the number of mouths to feed.
The contract between landlord and tenant doesn’t come up for review until the seventeenth-century plantings of capitalist finance give rise to the Industrial Revolution. Man comes to imagine that he holds the deed to nature, persuaded that if soundly managed as a commercial real estate venture, the property can be made to recruit larger armies, gather more votes, yield more cash. Add to the mechanical staples (John Deere’s cast-steel plow, Cyrus McCormick’s reaper) the twentieth century’s flavorings of laboratory science (chemical pesticides, synthetic gene sequences), and food becomes an industrial product subsumed into the body of a corporation.
So at least is my understanding from what I’m told by the news media and learn from the labels at the supermarket, which isn’t much because the message wrapped in cellophane holds with the Pentagon’s policy of don’t ask, don’t tell. I rely instead on Aristotle, who draws the distinction between wealth as food and wealth as money by pointing out that the stomach, although earless, is open to instruction and subject to restraint. A man can only eat so much (1,500 pounds of food per year according to current estimates), but the craving for money is boundless—the purse, not the belly, is the void that is never filled. Paul Roberts fits Aristotle’s observation to the modern circumstance: “Food production may follow general economic principles of supply and demand; it may indeed create employment, earn trade revenues, and generate profits, sometimes considerable profits; but the underlying product—the thing we eat—has never quite conformed to the rigors of the modern industrial model.”
Read more:
Having a Blast, 30 Years Later
by Claudine Zap
Thirty years ago, the first space shuttle launched into the stratosphere. Chris Bray and his father Kenneth watched -- and took a picture. Then last Friday, the shuttle Atlantis took its final trip. Again, the Bray men were there. And again, the two snapped a photo to capture the moment.
The side-by-side photos, which are up on Chris Bray's Flickr photostream, immediately went viral on the Web.
The first shot shows 13-year-old Chris with then 39-year-old dad looking through binoculars at the space shuttle Columbia's first launch on April 12, 1981, from the Kennedy Space Center.
The second snap comes three decades later and recreates the same moment at the last shuttle voyage. The young son is now an adult. His father is now gray-haired.
Chris Bray wrote on his Flickr page of the side-by-side images: "The picture we waited 30 years to complete."
The younger Bray told the Washington Post, "We've always loved that first photo. Taking a similar one for the last launch seemed like the perfect opportunity to celebrate the shuttle program and our relationship by putting the time passed in perspective, celebrating the interests we share, and illustrating the father/son bond we've maintained over the years."
The Brays' photo touched a chord of nostalgia in many rocket enthusiasts, and the pic has been viewed on Flickr an astronomical 510,000 times.
Comments on the pictures commend the melding of the personal with the historical. Says one: "Epic. To be able to share in something so wonderful with your dad, both beginning and end. I am jealous -- both that you watched not only the first but also the last mission -- but also that you did it with your father."
via: