Tuesday, November 15, 2016

Mark Zuckerberg Is in Denial

[ed. So much angst over the last election and people wondering/second-guessing now what to do to blunt the results or make sure nothing like this ever happens again. Here's a start: get off Facebook. It's poison disguised as community. It's corporate business disguised as your friend. It's crack for psychological vulnerabilities you never knew existed. Yes your real friends use it and it feels wonderful to be connected and share, but you're being used and manipulated and sold (to advertisers, Wall Street) and it's killing rational discourse and traditional news sources that actually spend money to produce (and fact check) the news that Facebook profits off of. Is that too hard, quitting Facebook? Then look in the mirror at one of the principal sources of your discontent. Here's an idea: if you really want to protest this election start with Facebook, then start hitting all the other capitalist manipulators and elite one-percenters where it really matters: in their pocketbooks. Forget dumb marches, and ironic posterboard signs, and meaningless editorials, and reorganizing a failed political party and system that acts in no one's interests but its own (as an aside... it's amazing that younger generations who hold the reins of technology and chafe under Boomers' lingering influence still cling to the same outdated protest game plan). Get real. Bernie and Elizabeth or Joe or anyone else isn't going to come save you if they have to work within that system. Let's have rolling boycotts of Facebook, and Walmart, and Comcast, and GE, and Exxon, and the banks, and anything Koch brothers related, and all the other companies that are killing our economy. Organize it and select a different corporation each month - or two, or however long it takes to make a dent in their balance sheets. Believe me, that's the only way you're going to get anyone's attention.
Then people might realize they don't need to work within the system to effect a true populist revolution (call it a crowdsourced revolution). It starts with sacrifice. But if you can't even make that effort then all the whining and hand-wringing in the world won't make a bit of difference. See also: Social Media's Globe-Shaking Power.]
Donald J. Trump’s supporters were probably heartened in September, when, according to an article shared nearly a million times on Facebook, the candidate received an endorsement from Pope Francis. Their opinions on Hillary Clinton may have soured even further after reading a Denver Guardian article that also spread widely on Facebook, which reported days before the election that an F.B.I. agent suspected of involvement in leaking Mrs. Clinton’s emails was found dead in an apparent murder-suicide.

There is just one problem with these articles: They were completely fake.

The pope, a vociferous advocate for refugees, never endorsed anyone. The Denver Guardian doesn’t exist. Yet thanks to Facebook, both of these articles were seen by potentially millions of people. Although corrections also circulated on the social network, they barely registered compared with the reach of the original fabrications.

This is not an anomaly: I encountered thousands of such fake stories last year on social media — and so did American voters, 44 percent of whom use Facebook to get news.

Mark Zuckerberg, Facebook’s chief, believes that it is “a pretty crazy idea” that “fake news on Facebook, which is a very small amount of content, influenced the election in any way.” In holding fast to the claim that his company has little effect on how people make up their minds, Mr. Zuckerberg is doing real damage to American democracy — and to the world.

He is also contradicting Facebook’s own research.

In 2010, researchers working with Facebook conducted an experiment on 61 million users in the United States right before the midterm elections. One group was shown a “go vote” message as a plain box, while another group saw the same message with a tiny addition: thumbnail pictures of their Facebook friends who had clicked on “I voted.” Using public voter rolls to compare the groups after the election, the researchers concluded that the second post had turned out hundreds of thousands of voters.

In 2012, Facebook researchers again secretly tweaked the newsfeed for an experiment: Some people were shown slightly more positive posts, while others were shown slightly more negative posts. Those shown more upbeat posts in turn posted significantly more of their own upbeat posts; those shown more downbeat posts responded in kind. Decades of other research concurs that people are influenced by their peers and social networks.

All of this renders preposterous Mr. Zuckerberg’s claim that Facebook, a major conduit for information in our society, has “no influence.”

The problem with Facebook’s influence on political discourse is not limited to the dissemination of fake news. It’s also about echo chambers. The company’s algorithm chooses which updates appear higher up in users’ newsfeeds and which are buried. Humans already tend to cluster among like-minded people and seek news that confirms their biases. Facebook’s research shows that the company’s algorithm encourages this by somewhat prioritizing updates that users find comforting. (...)

Content geared toward these algorithmically fueled bubbles is financially rewarding. That’s why YouTube has a similar feature in which it recommends videos based on what a visitor has already watched.

It’s also why, according to a report in BuzzFeed News, a bunch of young people in a town in Macedonia ran more than a hundred pro-Trump websites full of fake news. Their fabricated article citing anonymous F.B.I. sources claiming Hillary Clinton would be indicted, for example, got more than 140,000 shares on Facebook and may well have been viewed by millions of people since each share is potentially seen by hundreds of users. Even if each view generates only a fraction of a penny, that adds up to serious money.

Of course, fake news alone doesn’t explain the outcome of this election. People vote the way they do for a variety of reasons, but their information diet is a crucial part of the picture.

After the election, Mr. Zuckerberg claimed that the fake news was a problem on “both sides” of the race. There are, of course, viral fake anti-Trump memes, but reporters have found that the spread of false news is far more common on the right than it is on the left.

The Macedonian teenagers found this, too. They had experimented with left-leaning or pro-Bernie Sanders content, but gave up when they found it wasn’t as reliable a source of income as pro-Trump content. But even if Mr. Zuckerberg were right and fake news were equally popular on both sides, it would still be a profound problem.

Only Facebook has the data that can exactly reveal how fake news, hoaxes and misinformation spread, how much there is of it, who creates and who reads it, and how much influence it may have. Unfortunately, Facebook exercises complete control over access to this data by independent researchers. It’s as if tobacco companies controlled access to all medical and hospital records.

by Zeynep Tufekci, NY Times | Read more:
Image: Eric Risberg/Associated Press