Facebook’s policies against extremism: Ten years of struggle for more transparency

For years, social media platforms, including Facebook, have been criticized for the lack of transparency in their community standards, especially regarding extremist content. Yet moderation is not an easy task, particularly when extreme-right actors use content strategies that shift the Overton window (i.e., the range of ideas acceptable in public discourse) rightward. In a self-proclaimed search for more transparency, Facebook created its Transparency Center in May 2021. It has also regularly updated its community standards, and the Facebook Oversight Board has reviewed these standards based on concrete cases, with decisions published since January 2021. In this paper, we highlight how longstanding issues regarding Facebook’s lack of transparency remain unaddressed in its 2021 community standards, mainly in terms of the visual ‘representation’ of, and endorsement from, dangerous organizations and individuals. Furthermore, we reveal how the Board’s lack of access to Facebook’s in-house rules exemplifies the longstanding discrepancy between the public and confidential levels of Facebook’s policies, a discrepancy that risks turning the Board’s work into a mere PR effort. By taking as many steps toward shielding some information as toward exposing other information to the sunshine, Facebook’s efforts might amount to transparency theater.

Tags: Content Regulation, Extremist Content, Facebook, Social Media, Transparency