|Description||DIGHUM Seminar Series|
|Period||2 Feb 2021|
|Held at||TU Wien (Technische Universität Wien), Institute of Statistics and Mathematical Methods in Economics, Research Group Economics, Austria|
|Degree of Recognition||International|
A substantial portion of contemporary public discourse takes place on online social media platforms such as Facebook, YouTube, and TikTok. These platforms accordingly form a core component of the digital public sphere: although privately owned, they constitute a digital infrastructural resource that is open to members of the public. The content moderation systems deployed by such platforms can influence and shape public discourse by mediating what members of the public are able to see, hear, and say online. Over time, the rules these systems enforce may have a norm-setting effect, shaping users' conduct and their expectations of what constitutes "acceptable" discourse. The design and implementation of content moderation systems can therefore have a powerful impact on the preservation of users' freedom of expression. The emerging trend towards the deployment of algorithmic content moderation (ACM) systems raises urgent concerns about the need to ensure that content moderation is regulated in a manner that safeguards and fosters robust public discourse.

This lecture builds on research carried out within the framework of the Research Sprint on AI and Platform Governance (2020), organized by the HIIG, Berlin (for more information on the research project and its key findings, see Freedom of Expression in the Digital Public Sphere (graphite.page)). It explores how the proliferation of ACM poses increased risks to freedom of expression in the digital public sphere and proposes legal and regulatory strategies for ensuring greater public oversight and accountability in the design and implementation of content moderation systems by social media platforms.