[Commlist] CfA: Theorizing Platform Content Moderation: Power, Resistance, and Democratic Control
Mon May 08 10:27:29 GMT 2023
CfA: Theorizing Platform Content Moderation: Power, Resistance, and
Democratic Control
-Conference: 2023 MANCEPT (Manchester Centre for Political Theory) Workshops (https://sites.manchester.ac.uk/mancept/mancept-workshops/).
-Deadline for submission of abstracts (300 words): 11th of June (any time zone).
-Dates/location of the panel: 11th-13th of September 2023. The panel will take place in Manchester (UK), but remote participation is also possible.
Platform content moderation has emerged as a novel form of mass speech
governance, able to influence billions of people globally. Much of the
growing scholarship on it focuses on describing moderation’s
functioning, ambiguities, and technologies, and on how to hold platforms
to constitutional values. Yet, despite its obvious political nature,
content moderation remains under-theorized as a political practice.
This is puzzling, as moderation rearticulates key concepts of political theory. Despite their unilateral ability to moderate, platforms often seek to appease some actors in the design and enforcement of their
rules. These processes are hardly linear, though: not all voices, from
all countries, at all times, are equally heard. While moderation has
been used against authoritarian actors, it has also been shown to reinforce racist, sexist, and neo-colonial structures, often in the service of companies’ political and economic interests globally. This underscores the
need to understand how moderation relates to representation,
recognition, and plurality, which are closely associated with matters of
justice, equality, and dignity. Similarly, it is unclear what resistance
to these systems’ patterns of in- and exclusion might (and ought to) mean.
Two factors make platform content moderation challenging to address
through usual normative frameworks, such as legal rights. Firstly,
platforms are a peculiar kind of organization: globally operating
corporations whose immense power is not anchored in processes of
political legitimation (e.g., elections) or even a clear polity. Despite
the legality of their moderation practices, which are often protected by
so-called ‘safe harbour’ laws, these companies still owe us something
morally – but what, exactly, and which ‘us’ is this? Secondly, much of today’s moderation is automated, commonly through algorithmic systems based on machine learning. As a consequence, the meaning of “objectionable” or
“desirable”, or how to punish those who violate these definitions, may
emerge not from direct human reasoning but from probabilistic
calculations based on complexly constructed datasets. Whose voice is
represented and silenced when thousands of data annotators, moderators,
officers, and technologists play some role in the construction of the
algorithms that identify and control, say, hate speech? How should we account for the cascading layers of rules, institutions, and actors?
Workshop aims: This workshop aims to address the urgent task of
theorizing platform content moderation. We especially invite scholars
working from the perspective of radical democratic theory, democratic
resistance, decolonial theory, and political economy to consider three
broad questions:
(1) How should we conceptualize content moderation as a form of power, and in which ways does it differ from previous forms of speech control?
(2) What does proper resistance to moderation mean, and how can it tackle the multiple dynamics of in- and exclusion? And
(3) To what extent, and how, should democratic control over content moderation be organised?
How to apply: Send a 300-word abstract to (n.appelman /at/ uva.nl). The deadline is 11th of June (any time zone). Full papers are not required.
When & where: Workshops will take place preferably in person, between the 11th and 13th of September 2023, in Manchester (UK). Submissions to present online will be considered.
Organizers:
-Naomi Appelman, IViR (Institute for Information Law), University of Amsterdam (n.appelman /at/ uva.nl)
-João C. Magalhães, Centre for Media and Journalism Studies, University of Groningen (j.c.vieira.magalhaes /at/ rug.nl)