Supervision methods of intermediaries: Self-governance

Meng Xu
3 min read · Apr 4, 2021

The Internet provides people with a platform for free speech, but it also carries illegal and harmful content such as online pornography, piracy, and online harassment. When users turn to state regulation for oversight, the government's behaviour may itself be unconstitutional, since it risks restricting lawful speech. Self-governance by intermediaries may therefore be the most effective way to resolve this tension. Recent digital governance has clearly been shifting from state-centrism towards the autonomy of companies, individuals, and other non-state actors. This article takes Facebook as an example to discuss platform self-supervision policy.

As the world and online speech change, Facebook set up a dedicated Product & Content Policy team, which sits at the forefront of an open, worldwide dialogue about policies that balance expression and safety. This team is responsible for making, and maintaining transparency around, the rules that govern Facebook. These include guidelines on what people can post, what they can advertise, and how they can use other products, such as fundraising tools or Marketplace. The Product & Content Policy team works with individuals worldwide and has one goal in mind: to give people a voice in a safe environment.

The policy team has developed a series of community guidelines that stipulate the content and behaviour that Facebook allows and prohibits. These policies are based on community feedback, as well as expert advice in the fields of technology, public safety, and human rights. In order to ensure that everyone’s opinions are taken seriously, the platform has carefully formulated policies designed to accommodate different views and beliefs, especially those of users and communities who may be ignored or marginalised.

The community guidelines apply to all users and all types of content around the world, and their scope is wide: content that is not considered hate speech, for example, may still be removed if it violates another policy. A word's meaning varies by region, language and context, and so does its impact on people. The platform works to account for these nuances while applying its policies consistently and fairly to all users and the content they post. The consequences of violating the community guidelines depend on the severity of the violation and the offender's record on the platform. A first-time violator may only receive a warning, but if they continue to break the rules, the platform may restrict their ability to post on Facebook or disable their profile altogether.
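
To make that escalation logic concrete, here is a minimal sketch of how consequences might scale with severity and a user's violation record. This is not Facebook's actual enforcement system; the severity scale, thresholds, and action names are illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical enforcement sketch: the severity scale, thresholds, and
# action names below are assumptions for illustration, not Facebook's real rules.

@dataclass
class Violation:
    severity: int  # 1 = minor, 2 = serious, 3 = severe (assumed scale)

def decide_action(new_violation: Violation, prior_count: int) -> str:
    """Pick an escalating consequence based on severity and the user's record."""
    if new_violation.severity >= 3:
        return "disable_profile"      # severe violations skip the warning step
    if prior_count == 0:
        return "warn"                 # first-time offenders receive a warning
    if prior_count < 3:
        return "restrict_posting"     # repeat offenders lose posting privileges
    return "disable_profile"          # persistent offenders lose their profile

# Example: a second minor violation leads to a posting restriction.
print(decide_action(Violation(severity=1), prior_count=1))  # restrict_posting
```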

I consider this policy successful. Self-governance mitigates, to a certain extent, the harms that come with unrestricted speech, and every Facebook user benefits from it: it deters harmful online speech and improves the online environment. We can clearly observe that the implementation of the community guidelines has greatly reduced illegal speech online, which also demonstrates the policy's effectiveness. However, given the enormous volume of online content, it is difficult for the platform to monitor every user, so mutual supervision between users is also worth considering. For example, users can directly report suspected violations, including Pages, profiles, individual pieces of content, and comments, as sketched below.
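
As a rough illustration of such user-to-user reporting, the sketch below routes a report into a review queue. The reportable content types and the single queue are assumptions made for this example, not a description of Facebook's actual design.

```python
from dataclasses import dataclass
from queue import Queue

# Hypothetical reporting sketch: the content types and the single review
# queue are illustrative assumptions, not Facebook's real architecture.

REPORTABLE_TYPES = {"page", "profile", "post", "comment"}

@dataclass
class Report:
    content_id: str
    content_type: str
    reason: str

review_queue: Queue = Queue()

def submit_report(content_id: str, content_type: str, reason: str) -> bool:
    """Accept a report for a supported content type and queue it for review."""
    if content_type not in REPORTABLE_TYPES:
        return False
    review_queue.put(Report(content_id, content_type, reason))
    return True

# Example: a user flags a comment as a suspected violation.
submit_report("comment:12345", "comment", "suspected hate speech")
print(review_queue.qsize())  # 1
```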
