Archive for calls, September 2023


[Commlist] Cfp: Content Moderation on Digital Platforms: Beyond States and Firms

Tue Sep 05 12:16:33 GMT 2023





The Internet Policy Review has published a call for papers for a
special issue on "Content Moderation on Digital Platforms: Beyond
States and Firms".

Guest editors of the special issue:
Romain Badouard & Anne Bellon

Please find more information here and below:
https://policyreview.info/node/1717
Abstract submission deadline: 31 October 2023

++++

Call for papers: Content moderation on digital platforms: beyond
states and firms

Special issue of the Internet Policy Review

Content regulation on digital platforms has attracted growing interest
from scholars and regulators in recent years. In Europe, new legal
initiatives at the national and European level, such as the NetzDG in
Germany, the law on fake news in France and the Digital Services Act,
are transforming the power relations between public authorities and
private platforms. At the academic level, these developments are seen
primarily through the lens of the state-platform relationship, focusing
in particular on the role of nation states in internet governance.

However, content regulation on digital platforms goes far beyond this
dual relationship. It encompasses a wide diversity of actors who
develop their own practices of content regulation, apart from, in
partnership with, or against public authorities and firms. Journalists
build up fact-checking procedures to limit the spread of fake news.
Activists put pressure on advertisers in order to cut off sources of
funding for hate groups. Marketing agencies monetize blacklists of
sites, groups and channels to avoid reputational backlash while
advertising online. Social media users put counter-speech strategies
into practice in order to stem hate speech or circumvent restrictions
on social media. Even researchers develop their own transparency and
accountability indicators to assess platform policies related to
content regulation. Moreover, these actors are gradually being
associated with the evolution of content regulation through formal
partnerships and multi-stakeholder organizations.

All these actors, and many others, who could be grouped together under
the banner of "civil society", are now actively contributing to
content moderation and regulation on digital platforms. However, their
precise role, strategies and means of action remain little studied.
Following recent work published in Internet Policy Review, such as
calls to study content regulation issues beyond the scope of
relationships between states and platforms (Gillespie et al., 2020),
this special issue intends to illustrate the role of "civil society" in
the global governance of speech online and to map the variety of social
groups involved in content moderation. Drawing on existing work on
multistakeholderism that has discussed the participation of civil
society in global internet governance (Massit-Folléa, 2014; Raymond &
DeNardis, 2015), this special issue aims to push forward the
understanding of non-state actors' contribution to the current internet
regulatory framework. It calls for empirical studies that provide
insight into the social conditions and materiality of civil society's
contribution to content regulation.

Scope of the special issue

For this special issue we invite contributions that analyze the role
of users, non-profit organizations, media, technical and professional
groups, researchers and other actors from civil society in moderating
and regulating content on digital platforms, while potentially
discussing their articulation with public authorities and platform
initiatives. Three axes of contribution are envisioned.

First of all, this special issue aims to move beyond the
state-platform scope of analysis by gathering papers that describe the
complex configurations of actors involved in the moderation of
content. Contributors are invited to study the various social and
institutional arrangements informing moderation policies, and notably
the extent of civil society's involvement in this process – from the
orientation of moderation policy to the detection of illegal content.
Such contributions could also shed light on the role of users' protests
and mobilizations in shifting the scope and focus of moderation
(Gillespie, 2018; Myers-West, 2017), notably how sexual minorities and
race-based groups try to voice their concerns regarding the specific
discrimination and censorship they face online (Nakamura, 2013).

Secondly, we would like to identify papers that contribute to the
study of how technologies and design offer new forms of regulation.
Although many works have focused on algorithmic content detection by
the platforms (Yeung, 2018; Gorwa, Binns & Katzenbach, 2020), in
articulation with their business of data collection, we would like to
attract papers that discuss alternative designs and standards developed
by tech communities to promote more diverse and distributed types of
moderation. For example, platforms such as Mastodon emphasize the role
of users in defining alternative moderation rules and mechanisms. Such
a discussion may also include new strategies developed by activists to
document algorithmic censorship and shed light on shadowbanning
practices.

Thirdly, we would like to include in this special issue papers that
study the circulation and distribution of practices and knowledge
across civil society organizations. Global fora on content moderation
are key loci for observing exchanges between organizations as well as
conflicting interpretations of moderation principles. Far from being a
homogenous ensemble, civil society is a broad concept that gathers
various groups with conflicting interests and unequally distributed
resources. Such papers could contribute to a global mapping of power
relations within civil society, as well as shed new light on the
"Brussels effect" (Bradford, 2020) from third-party perspectives. More
broadly, they would offer a critical assessment of "civil society" as a
regulatory entity and of the way it is endorsed or instrumentalized by
governments and platforms.

Special issue editors

Romain Badouard, Associate Professor, University Paris Panthéon-Assas

Anne Bellon, Associate Professor, University of Technology of Compiègne

Important dates

750-1,000-word abstracts should be sent to (romain.badouard /at/ u-paris2.fr)
and (anne.bellon /at/ utc.fr) by 31 October 2023. Abstracts should
delineate the research question to be discussed, the case study to be
analysed, and the expected findings or conclusions.
Decisions will be sent to authors by 30 November 2023.
Full papers based on the selected abstracts should be submitted by 17
March 2024. Submissions should be around 6,000 words in length and must
follow the submission guidelines of the Internet Policy Review. They
will be peer-reviewed between March and July.
The planned publication date for this special issue is Q4 2024.
No payment will be required from authors.


---------------
The COMMLIST
---------------
This mailing list is a free service offered by Nico Carpentier. Please use it responsibly and wisely.
--
To subscribe or unsubscribe, please visit http://commlist.org/
--
Before sending a posting request, please always read the guidelines at http://commlist.org/
--
To contact the mailing list manager:
Email: (nico.carpentier /at/ commlist.org)
URL: http://nicocarpentier.net
---------------



