Oxford-Stanford report: Improving Facebook as forum for free speech and democracy

Gelo Gonzales

'Making Facebook a better forum for free speech and democracy is a significant part of a wider struggle to defend those values across the world,' says the report


MANILA, Philippines – A joint report from the University of Oxford and Stanford University, released Thursday, January 17, suggested 9 ways Facebook could improve itself as a forum for free speech and democracy.

The report summarizes the issues that have plagued Facebook in recent years, its earlier hesitation to acknowledge its power to shape opinion and public conversation, and its responsibilities to its billions of users on the main Facebook platform as well as on subsidiaries such as WhatsApp and Instagram.

It then acknowledges that the platform has more recently become more proactive about change: increasing transparency, cooperating more closely with governments, and ditching the troublesome motto of moving fast and breaking things. (READ: Will Facebook’s 2019 continue the failings of 2018?)

The report breaks down its suggestions across 3 categories: content policy and the moderation of free speech; the Facebook News Feed; and governance.

Content policy

For content policy, the report says one of the biggest problems, historically, is that users have known “very little about the fundamental rules of the road: what they are permitted to post, versus what is not allowed.”

Facebook’s policies have been criticized for lacking nuance, for being shrouded in secrecy, and for being reactive: policies often appear to be hastily instituted only after a controversy erupts.

Since then, Facebook has become more transparent, the report notes: it has published its internal Community Standards guidelines along with enforcement data, set up an appeals process, and doubled the ranks of its content reviewers.

The report makes 4 suggestions moving forward:

1) Tighten the wording of its community standards on hate speech – The report calls for “clearer, more tightly drawn definitions,” since the current wording tends to remain broad, “leading to erratic, inconsistent, and often context-insensitive takedowns.”

2) Hire more, and more culturally expert, content reviewers – Artificial intelligence (AI) is currently not adept enough to judge content that often needs regional and cultural context, so more cultural experts may need to oversee moderation operations. Hate speech is simply too complex, such that having “a single, inflexible worldwide set of standards” is “simply unfeasible,” says the report.

3) Increase decisional transparency – Facebook should “post and widely publicize case studies” on content moderation so that users may have a clearer understanding of why a certain enforcement or decision was made and what it has learned from a case.

4) Expand and improve the appeals process – While the report calls the current appeals process a step in the right direction, it says the process could be improved with more transparency, such as Facebook publishing how many pieces of content were appealed and the results of those appeals.

The Facebook News Feed

The biggest problems remain fake news and disinformation campaigns. Both are still the subject of many academic studies and journalistic reports, which usually center on one of the platform’s pillars: the News Feed.

Among the changes Facebook has implemented in hopes of combating these are fact-checking partnerships, transparency efforts for political ads, and a context button providing information about the publisher of a given article.

Here are the suggestions:

5) Provide meaningful News Feed controls for users – Users should have more control over the type of information they see on the feed. They need to be able to see which friends and Pages are being prioritized on their feed, and be provided with “buttons or sliders that allow users to control” how much content they see that challenges their own ideologies, how much news they see, and whether their feed is ranked algorithmically or simply chronological (see the sketch after this list).

6) Expand context and fact-checking facilities – Facebook has to invest in more “fact-checking infrastructure” across more countries. Currently, its fact-checking network covers only 24 of the 100 countries in which Facebook operates. The company also needs to dedicate more resources to identifying the “most authoritative and trusted sources of contextual information for each country, region and culture.”
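To make suggestion 5 concrete, here is a minimal sketch of what user-facing feed controls could look like in code. It is purely illustrative: the type names, fields, and weighting scheme below are assumptions for this example, not anything drawn from Facebook’s actual systems or the report itself.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FeedPreferences:
    """Hypothetical user-facing controls, the 'buttons or sliders' the report envisions."""
    chronological: bool = False         # toggle: plain reverse-chronological feed
    cross_ideology_weight: float = 0.5  # slider 0..1: surface more/fewer opposing views
    news_weight: float = 0.5            # slider 0..1: surface more/less news

@dataclass
class Post:
    author: str
    posted_at: datetime
    relevance: float          # the platform's own relevance estimate (assumed given)
    is_news: bool
    is_cross_ideology: bool   # challenges the user's inferred leaning

def rank_feed(posts: list[Post], prefs: FeedPreferences) -> list[Post]:
    # Chronological mode bypasses algorithmic ranking entirely.
    if prefs.chronological:
        return sorted(posts, key=lambda p: p.posted_at, reverse=True)

    def score(p: Post) -> float:
        s = p.relevance
        # Each slider scales its category between 0.5x (slider at 0) and 1.5x (slider at 1).
        if p.is_news:
            s *= 0.5 + prefs.news_weight
        if p.is_cross_ideology:
            s *= 0.5 + prefs.cross_ideology_weight
        return s

    return sorted(posts, key=score, reverse=True)
```

The point of the sketch is simply that exposing the toggle and sliders as first-class inputs to the ranking function would give users the visibility and control the report asks for.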

Governance

In spite of recent transparency efforts, the report says that “Facebook remains very difficult to study, meaning that it is very difficult for policymakers to be able to formulate evidence-based policy and truly understand the scope of the relevant problems.”

To improve, Facebook needs to expand its programs for collaborating with the public, governments, and academics – programs that will hopefully result in policies that benefit the greater good and not just the company’s bottom line.

The report makes the following suggestions:

7) Establish regular auditing mechanisms – These would be audits by 3rd-party organizations. “Global audits should involve trusted third parties that assess practices, products, and algorithms for undesirable outcomes, and identify possible improvements. They should feature academic and civil society participation, also drawing on region-specific organisations and expertise,” the report says.

8) Create an external content policy advisory group – Facebook needs an expert advisory group made up of civil society figures, academics, and journalists, among other key stakeholders, and should use its feedback to improve its content policies and standards. Just 2 years ago, content policy was shaped by a small internal group, which Facebook has since expanded to include experts.

9) Establish a meaningful external appeals body – “There should be some form of independent, external control of both the criteria and the actual decisions made” in Facebook’s appeal procedure, the report says. Done properly, a third-party appeals body would bring crucial “external input and much-needed forms of due process to Facebook’s private governance.”

“Together, we need to work towards a coherent mix of appropriate government regulation and industry-wide, platform-specific, and product-specific self-regulation, thus furnishing a credible democratic alternative to the Chinese model of authoritarian ‘information sovereignty’ that is gaining traction far beyond China’s borders. Making Facebook a better forum for free speech and democracy is a significant part of a wider struggle to defend those values across the world,” the report concludes.

The report’s authors are Timothy Garton Ash, professor of European Studies at the University of Oxford; Robert Gorwa, a PhD candidate in the Department of Politics and International Relations at the University of Oxford; and Danaë Metaxa, a PhD candidate in Computer Science and McCoy Center for Ethics in Society fellow at Stanford University.

Click here to view the full report. – Rappler.com



Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.