Wednesday, July 13, 2016

How Technology Disrupted the Truth

Twenty-five years after the first website went online, it is clear that we are living through a period of dizzying transition. For 500 years after Gutenberg, the dominant form of information was the printed page: knowledge was primarily delivered in a fixed format, one that encouraged readers to believe in stable and settled truths.

Now, we are caught in a series of confusing battles between opposing forces: between truth and falsehood, fact and rumour, kindness and cruelty; between the few and the many, the connected and the alienated; between the open platform of the web as its architects envisioned it and the gated enclosures of Facebook and other social networks; between an informed public and a misguided mob.

What is common to these struggles – and what makes their resolution an urgent matter – is that they all involve the diminishing status of truth. This does not mean that there are no truths. It simply means, as this year has made very clear, that we cannot agree on what those truths are, and when there is no consensus about the truth and no way to achieve it, chaos soon follows.

Increasingly, what counts as a fact is merely a view that someone feels to be true – and technology has made it very easy for these “facts” to circulate with a speed and reach that was unimaginable in the Gutenberg era (or even a decade ago). (...)

In the digital age, it is easier than ever to publish false information, which is quickly shared and taken to be true – as we often see in emergency situations, when news is breaking in real time. To pick one example among many, during the November 2015 Paris terror attacks, rumours quickly spread on social media that the Louvre and Pompidou Centre had been hit, and that François Hollande had suffered a stroke. Trusted news organisations are needed to debunk such tall tales.

Sometimes rumours like these spread out of panic, sometimes out of malice, and sometimes out of deliberate manipulation, in which a corporation or regime pays people to convey its message. Whatever the motive, falsehoods and facts now spread the same way, through what academics call an “information cascade”. As the legal scholar and online-harassment expert Danielle Citron describes it, “people forward on what others think, even if the information is false, misleading or incomplete, because they think they have learned something valuable.” This cycle repeats itself, and before you know it, the cascade has unstoppable momentum. You share a friend’s post on Facebook, perhaps to show kinship or agreement or that you’re “in the know”, and thus you increase the visibility of their post to others.

Algorithms such as the one that powers Facebook’s news feed are designed to give us more of what they think we want – which means that the version of the world we encounter every day in our own personal stream has been invisibly curated to reinforce our pre-existing beliefs. When Eli Pariser, the co-founder of Upworthy, coined the term “filter bubble” in 2011, he was describing how the personalised web – and in particular Google’s personalised search function, which ensures that no two people’s search results are the same – makes us less likely to be exposed to information that challenges us or broadens our worldview, and less likely to encounter facts that disprove false information that others have shared.

Pariser’s plea, at the time, was that those running social media platforms should ensure that “their algorithms prioritise countervailing views and news that’s important, not just the stuff that’s most popular or most self-validating”. But in less than five years, thanks to the incredible power of a few social platforms, the filter bubble that Pariser described has become much more extreme.

On the day after the EU referendum, in a Facebook post, the British internet activist and mySociety founder Tom Steinberg provided a vivid illustration of the power of the filter bubble – and the serious civic consequences for a world where information flows largely through social networks:
I am actively searching through Facebook for people celebrating the Brexit leave victory, but the filter bubble is SO strong, and extends SO far into things like Facebook’s custom search that I can’t find anyone who is happy *despite the fact that over half the country is clearly jubilant today* and despite the fact that I’m *actively* looking to hear what they are saying. 
This echo-chamber problem is now SO severe and SO chronic that I can only beg any friends I have who actually work for Facebook and other major social media and technology to urgently tell their leaders that to not act on this problem now is tantamount to actively supporting and funding the tearing apart of the fabric of our societies … We’re getting countries where one half just doesn’t know anything at all about the other.
But asking technology companies to “do something” about the filter bubble presumes that this is a problem that can be easily fixed – rather than one baked into the very idea of social networks that are designed to give you what you and your friends want to see.

Facebook, which launched only in 2004, now has 1.6bn users worldwide. It has become the dominant way for people to find news on the internet – and in fact it is dominant in ways that would have been impossible to imagine in the newspaper era. As Emily Bell has written: “Social media hasn’t just swallowed journalism, it has swallowed everything. It has swallowed political campaigns, banking systems, personal histories, the leisure industry, retail, even government and security.”

Bell, the director of the Tow Centre for Digital Journalism at Columbia University – and a board member of the Scott Trust, which owns the Guardian – has outlined the seismic impact of social media for journalism. “Our news ecosystem has changed more dramatically in the past five years,” she wrote in March, “than perhaps at any time in the past 500.” The future of publishing is being put into the “hands of the few, who now control the destiny of the many”. News publishers have lost control over the distribution of their journalism, which for many readers is now “filtered through algorithms and platforms which are opaque and unpredictable”. This means that social media companies have become overwhelmingly powerful in determining what we read – and enormously profitable from the monetisation of other people’s work. As Bell notes: “There is a far greater concentration of power in this respect than there has ever been in the past.”

Publications curated by editors have in many cases been replaced by a stream of information chosen by friends, contacts and family, processed by secret algorithms. The old idea of a wide-open web – where hyperlinks from site to site created a non-hierarchical and decentralised network of information – has been largely supplanted by platforms designed to maximise your time within their walls, some of which (such as Instagram and Snapchat) do not allow outward links at all.

Many people, in fact, especially teenagers, now spend more and more of their time on closed chat apps, which allow users to create groups to share messages privately – perhaps because young people, who are most likely to have faced harassment online, are seeking more carefully protected social spaces. But the closed space of a chat app is an even more restrictive silo than the walled garden of Facebook or other social networks.

As the pioneering Iranian blogger Hossein Derakhshan, who was imprisoned in Tehran for six years for his online activity, wrote in the Guardian earlier this year, the “diversity that the world wide web had originally envisioned” has given way to “the centralisation of information” inside a select few social networks – and the end result is “making us all less powerful in relation to government and corporations”.

Of course, Facebook does not decide what you read – at least not in the traditional sense of making decisions – and nor does it dictate what news organisations produce. But when one platform becomes the dominant source for accessing information, news organisations will often tailor their own work to the demands of this new medium. (The most visible evidence of Facebook’s influence on journalism is the panic that accompanies any change in the news feed algorithm that threatens to reduce the page views sent to publishers.)

In the last few years, many news organisations have steered themselves away from public-interest journalism and toward junk-food news, chasing page views in the vain hope of attracting clicks and advertising (or investment) – but like junk food, you hate yourself when you’ve gorged on it. The most extreme manifestation of this phenomenon has been the creation of fake news farms, which attract traffic with false reports that are designed to look like real news, and are therefore widely shared on social networks. But the same principle applies to news that is misleading or sensationally dishonest, even if it wasn’t created to deceive: the new measure of value for too many news organisations is virality rather than truth or quality.

by Katharine Viner, The Guardian
Image: Sébastien Thibault