First, some background. In 1996, two US lawmakers, Chris Cox of California and Ron Wyden of Oregon, inserted a clause into the sprawling telecommunications bill that was then on its way through Congress. The clause eventually became section 230 of the Communications Decency Act and read: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
The motives of the two politicians were honourable: they had seen how providers of early web-hosting services had been held liable for damage caused by content posted by users over whom they had no control. It’s worth remembering that those were early days for the internet, and Cox and Wyden feared that if lawyers henceforth had to crawl over everything hosted on the medium, the growth of a powerful new technology would be crippled more or less from birth. And in that sense they were right.
What they couldn’t have foreseen, though, was that section 230 would turn into a get-out-of-jail-free card for some of the most profitable companies on the planet – such as Google, Facebook and Twitter, which built platforms enabling their users to publish anything and everything without the owners incurring legal liability for it. So far-reaching was the Cox-Wyden clause that a law professor, Jeff Kosseff, eventually wrote a whole book about it, The Twenty-Six Words That Created the Internet. A bit hyperbolic, perhaps, but you get the idea.
Now spool forward to November 2015, when Nohemi Gonzalez, a young American studying in Paris, was gunned down in a restaurant by the Islamic State terrorists who murdered 129 other people that night. Her family sued Google, arguing that its YouTube subsidiary had used algorithms to push IS videos to impressionable viewers, using the information that the company had collected about them. Their petition seeking a supreme court review argues that “videos that users viewed on YouTube were the central manner in which IS enlisted support and recruits from areas outside the portions of Syria and Iraq which it controlled”.
The key thing about the Gonzalez suit, though, is not that YouTube should not be hosting IS videos (section 230 allows that) but that its machine-learning “recommendation” algorithms, which may push other, perhaps more radicalising, videos at viewers, render it liable for the resulting damage. Or, to put it crudely, while YouTube may have legal protection for hosting whatever its users post on it, it does not – and should not – have protection for an algorithm that determines what they should view next.
This is dynamite for the social-media platforms because recommendation engines are the key to their prosperity. They are the power tools that increase user “engagement” – keeping people on the platform to leave the digital trails (viewing, sharing, liking, retweeting, purchasing, etc) – that enable the companies to continually refine user profiles for targeted advertising, and to make unconscionable profits from doing so. If the supreme court were to decide that these engines did not enjoy section 230 protection, then social-media firms would suddenly find the world a much colder place.
by John Naughton, The Guardian | Read more:
Image: Eugene García/EPA