My late colleague, Neil Postman, used to ask about any new proposal or technology, “What problem does it propose to solve?”
When it comes to Facebook, that problem was maintaining relationships over vast time and space. And the company has solved it, spectacularly. Along the way, as Postman would have predicted, it created many more problems.
Last week, Facebook revealed the leaders and first 20 members of its new review board. They are an august collection of some of the sharpest minds who have considered questions of free expression, human rights, and legal processes.
They represent a stratum of cosmopolitan intelligentsia quite well, while projecting a semblance of global diversity. These distinguished scholars, lawyers, and activists are charged with generating high-minded deliberation about what is fit and proper for Facebook to host. It’s a good look for Facebook—as long as no one looks too closely.
What problems does the new Facebook review board propose to solve?
In an op-ed in The New York Times, the board’s new leadership declared: “The oversight board will focus on the most challenging content issues for Facebook, including in areas such as hate speech, harassment, and protecting people’s safety and privacy. It will make final and binding decisions on whether specific content should be allowed or removed from Facebook and Instagram (which Facebook owns).”
Only in the narrowest and most trivial of ways does this board have any such power. The new Facebook review board will have no influence over anything that really matters in the world.
It will hear only individual appeals about specific content that the company has removed from the service—and only a fraction of those appeals. The board can’t say anything about the toxic content that Facebook allows and promotes on the site. It will have no authority over advertising or the massive surveillance that makes Facebook ads so valuable. It won’t curb disinformation campaigns or dangerous conspiracies. It has no influence on the sorts of harassment that regularly occur on Facebook or (Facebook-owned) WhatsApp. It won’t dictate policy for Facebook Groups, where much of the most dangerous content thrives. And most important, the board will have no say over how the algorithms work, and thus over what gets amplified or muffled; the algorithms are the real power of Facebook.
This board has been hailed as a grand experiment in creative corporate governance. St. John’s University law professor Kate Klonick, the scholar most familiar with the process that generated this board, said, “This is the first time a private transnational company has voluntarily assigned a part of its policies to an external body like this.”
That’s not exactly the case. Industry groups have long practiced such self-regulation through outside bodies, with infamously mixed results. But there is no industry group to set standards and rules for Facebook. One-third of humanity uses the platform regularly. No other company has ever come close to having that level of power and influence. Facebook is an industry—and thus an industry group—unto itself. What is unprecedented is that Facebook ultimately controls the board, not the other way around.
We have seen this movie before. In the 1930s the Motion Picture Producers and Distributors of America (renamed the Motion Picture Association of America in 1945), under the leadership of former US postmaster general Will Hays, instituted a strict code that prohibited major Hollywood studios from showing, among other things, “dances which emphasize indecent movements.” The code also ensured that “the use of the [US] flag shall be consistently respected.” By the 1960s, American cultural mores had broadened, and directors demanded more freedom to display sex and violence. So the MPAA abandoned the Hays code and adopted the ratings system familiar to American moviegoers (G, PG, PG-13, R, NC-17). (...)
When self-regulation succeeds at improving conditions for consumers, citizens, or workers, it does so by establishing deliberative bodies that can act swiftly and firmly, and generate clear, enforceable codes of conduct. If one movie studio starts dodging the ratings process, the MPAA and its other members can pressure theaters and other delivery channels to stop showing that studio’s films. The MPAA can also expel a studio, depriving it of the political capital generated by the association’s decades of campaign contributions and lobbying.
The Facebook board has no such power. It can’t generate a general code of conduct on its own, or consider worst-case scenarios to advise the company how to minimize the risk of harm. That would mean acting like a real advisory board. This one is neutered from the start because someone had the stupid idea that it should perform a quasi-judicial role, examining cases one by one.
We know the process will be slow and plodding. Faux-judicial processes might seem deliberative, but they are narrow by design. The core attribute of the common law is conservatism. Nothing can change quickly. Law is set by courts through adherence to previous decisions. Tradition and predictability are paramount values. So is stability for stability’s sake.
But on Facebook, as in global and ethnic conflict, the environment is tumultuous and changing all the time. Calls for mass violence spring up, seemingly out of nowhere. They take new forms as cultures and conditions shift. Facebook moves fast and breaks things like democracy. This review board is designed to move slowly and preserve things like Facebook.
by Siva Vaidhyanathan, Wired | Read more:
Image: Sam Whitney/Getty