[Commlist] CFP - Foreseeing Race: The Technology and Culture of Risk Prediction after the Datalogical Turn
Fri Mar 06 03:02:54 GMT 2020
*Foreseeing Race: The Technology and Culture of Risk Prediction after the Datalogical Turn*
Special issue, Journal of American Studies
<https://www.cambridge.org/core/journals/journal-of-american-studies>
Editors:
Georgiana Banita (University of Bamberg)
R. Joshua Scannell (The New School)
The current crisis of legitimacy in Big Tech and AI provides an opening
for forceful scholarly criticism of predictive and surveillant
technologies’ impacts on social life. The erosion of the supposed
trustworthiness and beneficence of the industry as a whole, and the Big
Four (Apple, Google, Facebook, Amazon) in particular, has precipitated a
discursive disillusionment with tech. And, while this has raised
questions of technological racism in some quarters, the broader role
that race plays in driving the bad actions of the tech industry remains
underreported and undertheorized. Perhaps more importantly, there is a
real opportunity here to imagine forms of struggle and liberation that
/may/ arise through tech, but certainly will not do so by themselves. In
other words, by focusing on racialized technology in an American context, we hope to challenge the assumption that the answer to these forms of racism is merely “better” technology, and to ask whether it instead depends on rethinking the terms of the discussion.
To name one controversial example: Big Data surveillance and risk
prediction have recently gained prominence within public safety and law
enforcement circles. Algorithmic decision-aids, which encode criminological theories about crime rates and process vast amounts of data into patterns and trends, have applications ranging from sentencing and the anticipation of recidivism to predictive policing. Even though it remains unclear to what extent
predictive crime analytics systems absorb human biases, the technology
is already widespread in the US, UK, and continental Europe. Recent work
in sociology, political science, and computational social sciences as
well as high-profile human rights investigations have launched a
much-needed critical narrative by demonstrating that predictive
algorithms undermine equal justice and preclude individualized
assessments. While literary and cultural studies have been slower to
recognize the amplifying effect of unfair algorithms on existing
disparities of race, their methods of diachronic analysis and increasing
societal engagement have much to offer in answering key questions. How
did predictive technology gain social acceptance? And which historical
strategies of racial oppression inform contemporary methods of data
mining and crime forecasting?
In her celebrated examination of data-based inequality, Virginia Eubanks
argues that the world is enveloped in an inscrutable network of
“informational sentinels” that frame and allocate the distribution of
resources and privation outside of human agency or oversight. More
specifically with regard to race, new forms of racial profiling that
Safiya Noble has termed “technological redlining” consolidate existing
racial biases, inequalities, and the maldistribution of life chances
while formally appearing to operate “race blind.” In focusing on digital
red flags attached to people of color, these and other sociological
studies have uncovered the significant role of data discrimination in
the perpetuation of racism into the 21st century. Recent scholarship by Ruha Benjamin on what she designates “the New Jim Code” has further demonstrated the value of tracing present algorithmic tools back to past techniques for enforcing racial discrimination and White supremacy.
This assemblage of techniques and technologies is designed to measure
and manage human capacities, predict tendencies, and control bodies in
space and time without ever conjuring a “human” referent as such.
Instead, an array of systems generates and sifts data black-boxed from human comprehension and abstracted from any recognizable set of human properties, yet outputs consequences of vulnerability and premature death for the same groups of people who are historically and systemically marginalized. Patricia Clough et al. have called this mode of governmentality the “datalogical,” a sort of technical read/write program for racism-by-proxy that writes racial formations even as it reads disembodied and decontextualized “data.”
Given the longue durée and infrastructural nature of racial bias in the
US, literary and cultural methods are well-positioned to engage with the
continuities of racial imaginaries and thereby reveal the occluded
premises of racialized prediction in the present. Furthermore, querying
the purported advantages of state and infrastate surveillance and
prediction systems can revitalize cultural, ethical, and aesthetic
debates on the uses of narrative and visual media in making sense of
this disquieting morphological relation between individuals and
institutions of social control. Can the toolkit of cultural studies
reveal a clearer genealogy of digitally-driven surveillance and crime
analytics? And might the collating, speculative nature of cultural interpretation itself not be entirely dissimilar to the practices of automated intelligence?
In navigating this terrain, our interdisciplinary special issue aligns
contemporary American and British scholarship in the area of race and
digital technologies with broader cultural theorizations of how race
works as a structuring agent in the history and imaginary of the United
States. The essays will share a commitment to political and social
critique while adopting cultural studies approaches to explore the
racialized dimension of Big Data surveillance; the racial inequities
embedded in prediction technologies; and the problematic invisibility of
algorithmic (in)justice. The special issue will thus provide a
US-focused cultural conceptualization of predictive technology within
the emergent fields of automation studies and the ethics of Big Data.
Potential topics include but are not limited to:
Law Enforcement
-the use of socio-politically derived Artificial Intelligence in law
enforcement
-epistemologies and temporalities of predictive policing
-alternative frameworks for rethinking the future of race and high-tech
policing
-aesthetic and narrative approaches to crime forecasting
-continuities between digitally-driven racialized surveillance and
“analog” methods (e.g., Terry Stops, broken windows policing, etc.)
Surveillance
-militarized surveillance practices in formally non-conflict
environments (especially drone surveillance)
-countermeasures to racialized Big Data surveillance (including cloaking
and the development of alternative “black data”)
-statehood, governmentality, and the surveillance of race
-relationship of digital surveillance and prediction to broader
extractive practices of racial capitalism
Cultural Practice
-literary, visual, and quotidian responses to surveillance and
technologies of prediction
-historical precedents and roots of current algo-racism
-history and cultural production around the concept of prediction
-racialized prediction in the context of Afrofuturism, technology, and
the posthuman
We are asking for abstracts of around 500 words by March 31. Article drafts of 8,000 to 12,000 words will be due by October 31; there is no article processing charge. Please email abstracts and queries to both editors at (Georgiana.Banita /at/ uni-bamberg.de) and (ScannellRJ /at/ newschool.edu).