TY - GEN
T1 - Freedom of expression in the digital public sphere
T2 - Strategies for bridging information and accountability gaps in algorithmic content moderation
AU - Mendis, Sunimal
AU - Cowls, Josh
AU - Darius, Philipp
AU - Golunova, Valentina
AU - Prem, Erich
AU - Santistevan, Dominiquo
AU - Wang, Wayne Wei
PY - 2020/12/03
Y1 - 2020/12/03
AB - A substantial portion of contemporary public discourse and social interaction is conducted on online social media platforms such as Facebook, YouTube, Reddit, and TikTok. Accordingly, these platforms form a core component of the digital public sphere which, although privately owned, constitutes a digital infrastructural resource open to members of the public. As private entities, platforms can set their own rules for participation in the form of terms of service, community standards, and other guidelines. The content moderation systems deployed by platforms to ensure that posted content complies with these terms, conditions, and standards can influence and shape public discourse by mediating what members of the public are able to see, hear, and say online. Over time, these rules may have a norm-setting effect, shaping users' conduct and expectations of what acceptable discourse looks like. Thus, the design and implementation of content moderation systems have a powerful impact on users' freedom of expression and their access to dialogic interaction on the platform. With great power comes great responsibility: the increasing adoption of algorithmic content moderation systems with a questionable track record of safeguarding freedom of expression raises urgent concerns about the need to regulate content moderation in a manner that safeguards and fosters robust public discourse in the online sphere.
DO - 10.5281/zenodo.4292408
M3 - Other contribution
PB - Alexander von Humboldt Institute for Internet and Society
ER -