Here's a story you've heard about the Internet: we trade our privacy for services. The idea is that your private information is less valuable to you than it is to the firms that siphon it out of your browser as you navigate the Web. They know what to do with it to turn it into value—for them and for you. This story has taken on mythic proportions, and no wonder, since it has billions of dollars riding on it.
But if it's a bargain, it's a curious, one-sided arrangement. To understand the kind of deal you make with your privacy a hundred times a day, please read and agree with the following:
By reading this agreement, you give Technology Review and its partners the unlimited right to intercept and examine your reading choices from this day forward, to sell the insights gleaned thereby, and to retain that information in perpetuity and supply it without limitation to any third party.

Actually, the text above is not exactly analogous to the terms on which we bargain with every mouse click. To really polish the analogy, I'd have to ask this magazine to hide that text in the margin of one of the back pages. And I'd have to end it with "This agreement is subject to change at any time." What we agree to participate in on the Internet isn't a negotiated trade; it's a smorgasbord, and intimate facts of your life (your location, your interests, your friends) are the buffet.
Why do we seem to value privacy so little? In part, it's because we are told to. Facebook has more than once overridden its users' privacy preferences, replacing them with new default settings. Facebook then responds to the inevitable public outcry by restoring something that's like the old system, except slightly less private. And it adds a few more lines to an inexplicably complex privacy dashboard.
Even reading the fine print doesn't help much: human beings are awful at pricing out the net present value of a decision whose consequences lie far in the future. No one would take up smoking if the tumors sprouted with the first puff. Most privacy disclosures don't put us in immediate physical or emotional distress either. But given a large population making a large number of disclosures, harm is inevitable. We've all heard the stories about people who've been fired because they set the wrong privacy flag on the post where they blew off steam about the job.
The risks increase as we disclose more, something that the design of our social media conditions us to do. When you start your life in a new social network, you are rewarded with social reinforcement as your old friends pop up and congratulate you on arriving at the party. Subsequent disclosures generate further rewards, but not reliably. Some disclosures seem like bombshells to you ("I'm getting a divorce") but produce only virtual cricket chirps from your social network. And yet seemingly insignificant communications ("Does my butt look big in these jeans?") can produce a torrent of responses. Behavioral scientists have a name for this dynamic: "intermittent reinforcement." It's one of the most powerful behavioral training techniques we know about. Give a lab rat a lever that produces a food pellet on demand, and he'll press it only when he's hungry. Give him a lever that produces food pellets at random intervals, and he'll keep pressing it forever.
How does society get better at preserving privacy online? As Lawrence Lessig pointed out in his book Code and Other Laws of Cyberspace, there are four possible mechanisms: norms, law, code, and markets.
by Cory Doctorow, MIT Technology Review
Photo: Jonathan Worth | Creative Commons