[Commlist] Visual Cultures Spring Term Public Programme: 'Race and/as Technology,' February 27th-March 27th
Thu Feb 27 09:14:08 GMT 2025
*Visual Cultures Spring Public Programme: Race and/as Technology*
You are cordially invited to attend Goldsmiths’ Department of Visual 
Cultures’ Spring Term Public Programme, a series of lectures organised 
around the theme ‘Race and/as Technology.’ This series adopts a media 
studies and visual cultures approach to the theme and will be of 
interest to media scholars.
All lectures are free and open to all, with no need to register, and 
take place *Thursdays, 5pm – 7pm, Professor Stuart Hall Building, Room 
LG01, Goldsmiths, University of London*.
The series has been running for four weeks, with prior events including a 
screening of /Geomancer/ (2017) by Lawrence Lek and lectures by Scott 
Wark (Goldsmiths), Aleena Chia (Goldsmiths), and Brett Zehner (Exeter).
It continues on Thursdays from *February 27th – March 27th* with 
lectures by Eunsong Kim (Northeastern; presenting online), Ramon Amaro 
(Het Nieuwe Instituut, Rotterdam), Beryl Pong (Cambridge), Derica 
Shields (writer and editor), and Maya Indira Ganesh (Cambridge) and Thao 
Phan (Australian National University).
Please see below for a description of the series’ rationale and 
abstracts for upcoming lectures. For more information, please get in 
touch with the series organiser, *Scott Wark* (s.wark /at/ gold.ac.uk).
*Series Overview*
In 2013, the media theorist Wendy H. K. Chun published an essay entitled 
‘Race and/as Technology, or, How to do Things with Race.’ The premise of 
this provocative essay was to (re)conceptualise race as something that 
is external to racialized subjects; something, as she puts it, that is 
/done/ to subjects and something that they can /do/.
Using Chun’s proposition as a guiding thread, Visual Cultures’ Spring 
2025 Public Programme interrogates the intersections, conflicts, 
confluences and divergences of race and technology when these two terms 
are thought ‘and/as’.
How are the impacts of technical ‘innovation’ and ‘disruption’ unequally 
experienced by racialized subjects? What new geographies of racialized 
inclusion/exclusion are drawn by such technologies? Do digital 
technologies offer opportunities for liberation from racial oppression, 
or increase racialised discrimination? How does race intersect with 
modes of capital accumulation aided by technological ‘progress’? And 
finally – to push this logic to its limit – what happens to ‘race’ 
itself, as experience but also as concept, as identity markers such as 
race are increasingly dissolved into data and subject to AI techniques?
*February 27th, 2025: *Found / Property: Museums, Race and the Making 
of Labor Divisions, Eunsong Kim [online]
This talk will be informed by /The Politics of Collecting: Race & the 
Aestheticization of Property/ (Duke University Press 2024). This book 
materializes the histories of immaterialism by examining the rise of US 
museums, avant-garde forms, digitization, and neoliberal aesthetics, to 
consider how race and property become foundational to modern artistic 
institutions.
*Eunsong Kim* is an Associate Professor in the Department of English at 
Northeastern University. Her practice spans poetics, literary studies, 
and translation. She is the author of /gospel of regicide/, published by 
Noemi Press in 2017, and with Sung Gi Kim she translated Kim Eon Hee’s 
poetic text /Have You Been Feeling Blue These Days?/ published in 2019. 
Her monograph, /The Politics of Collecting: Race & the Aestheticization 
of Property/ (Duke University Press 2024) materializes the histories of 
immaterialism by examining the rise of US museums, avant-garde forms, 
digitization, and neoliberal aesthetics, to consider how race and 
property become foundational to modern artistic institutions. She is the 
recipient of the Ford Foundation Fellowship, a grant from the Andy 
Warhol Art Writers Program, and Yale’s Poynter Fellowship. In 2021 she 
co-founded offshoot, an arts space for transnational activist conversations.
*March 6th, 2025: *The ascent, the exploration, and a more general 
mythology on techno-elitism and engineering, Ramon Amaro
This talk ends by considering notions of the contemporary human 
condition, or more so what to do with and about the fanatic discipline 
of technical engineering, at the hands of which we find ourselves stuck 
within the problem of a techno-humanist society led by Silicon elites, 
and the general misunderstanding of the role of technique within the 
current neotenic development of the human species. That is to say, my 
exploration emerges out of the urgency to place a new frame of meaning 
onto the exchange between solutionism, engineering, and environment, as 
presently situated within a world given to us as phenomena to experience 
rather than dominate. I argue that an awareness of this mode of 
existence must be thought through at the level of the vital in 
understanding the association between the technical milieu and practices 
of socio-technical manipulation and control. Here, I question how 
technicism plays a central role in culture-specific individual psychic 
and collective realities, and how these realities seek to compensate for 
the alienating results of techno-social being. Our discussion, 
nonetheless, begins with the story of a prince, a slave ship, and a 
Portuguese book seller before moving through a historical view of the 
engineer as ceremonial being and catalyst of new world orders.
*Dr. Ramon Amaro* is Senior Researcher for Digital Culture and Lead 
Curator at -1 <https://nieuweinstituut.nl/en/projects/minus-one>, 
an experimental testing ground for new tools, methods and public uses of 
digital culture at Nieuwe Instituut, the national museum and institute 
for architecture, design and digital culture in The Netherlands. His 
writings, research and artistic practice emerge at the intersections of 
Black Study, digital culture, psychosocial study, and the critique of 
computational reason. Ramon holds a BSc in Mechanical Engineering, an MA 
in Sociology and a PhD in Philosophy of Technology. Before joining 
Nieuwe Instituut, Ramon worked as Lecturer in Art and Visual Cultures of 
the Global South at UCL, Lecturer in Visual Cultures at Goldsmiths, 
Engineering Program Manager at the American Society of Mechanical 
Engineers, and Quality Design Engineer at General Motors Corporation. 
His recent book, /The Black Technical Object: On Machine Learning and 
the Aspiration of Black Being/ contemplates the abstruse nature of 
programming and mathematics, and the deep incursion of racial hierarchy, 
to inspire alternative approaches to contemporary algorithmic practice.
*March 13th, 2025: *Volumetric Mediations: Atmospheres of Crisis and 
Unbelonging in Humanitarian Drone Documentaries, Beryl Pong
This talk takes as its starting point the argument made during the first 
wave of critical drone studies, which primarily focused on drone 
warfare, that the drone is a ‘technology of racial distinction’ 
(Allinson, 2015). In this context, drones have been conceived of as 
atmospheric technologies that engage in ‘racialization from above’ 
(Feldman, 2011): that target the minoritized while rendering the 
Westphalian border fluid and contingent, engaging in ‘ordering without 
bordering’ (Agius, 2017). In this talk, I will extend and re-consider 
this argument for the realm of forced migration. Focusing on the use of 
drones at the borders, variously conceived, where migrants are delayed, 
contained, or rendered immobile, I ask how ‘civilian’ drone culture in 
the form of humanitarian drone documentaries makes visible the nexus of 
race, border, and atmosphere in new ways. In particular, I will explore 
how drone documentaries about forced migration involve ‘volumetric’ 
mediations that engage with three-dimensional space with complex heights 
and depths (Jackman and Squire, 2021). In doing so, they create spaces 
of vexed encounter and relationality between the seer and the seen, to 
present different atmospheres of migration: as crisis, from above, and 
as unbelonging, from below.
*Beryl Pong* is a UKRI Future Leaders Fellow at the Centre for the 
Future of Intelligence, University of Cambridge, where she directs the 
Centre for Drones and Culture. She is the author of /British Literature 
and Culture in Second World Wartime: For the Duration/ (Oxford 
University Press, 2020), and the co-editor of /Drone Aesthetics: War, 
Culture, Ecology/ (Open Humanities Press, 2024). Her writing on drones 
and culture has appeared or is forthcoming in /Big Data & Society/, 
/Collateral/, /Cultural Politics/, /Journal of War and Culture/, and 
elsewhere. Her immersive installation about drone warfare, /Beware Blue 
Skies/, is showing at the Imperial War Museum, London, until March 2025.
*March 20th, 2025: *Derica Shields [title TBC]
*March 27th, 2025: *Who's Represented/What's Unrepresentable: Two 
Lectures on AI and Race, by Maya Indira Ganesh and by Thao Phan 
(presenting a talk co-authored with Fabian Offert)
*Maya Indira Ganesh: ‘I became a Woman of Colour in 2013’: De-centering 
Whiteness for Savarna-ness in thinking about technology and power.*
This is a short and early conversation about re-configuring studies of 
AI and bias away from a positionality that centres Whiteness chiefly 
because it obscures the axes of power and discrimination that matter in 
the lived realities of the global majority. Indian Dalit, Bahujan, and 
Adivasi (DBA) scholars have already developed a body of work showing the 
intersections of caste power, discrimination, and technology. As their 
work, and that of new Dalit intellectuals, argues, caste is not just 
another demographic marker to be ticked off in a bias-mitigation 
toolkit. This is partly to do with how caste privilege and hierarchies 
present and are experienced. But a more compelling reason is that we are 
still in need of radical and speculative approaches that might flip the 
script by drawing attention to Savarna privilege and how power works 
rather than how oppression is experienced.
*Thao Phan and Fabian Offert: Are some things (still) unrepresentable?*
“Are some things unrepresentable?” asks a 2011 essay by Alexander 
Galloway. It responds to a similarly titled, earlier text by the 
philosopher Jacques Rancière examining the impossibility of representing 
political violence, with the Shoah as its anchor point. How, and how 
much, political violence, asks Rancière, can be represented? What visual 
modes, asks Galloway, can be used to represent the unrepresentable? In 
this talk, we examine two contemporary artistic projects that deal with 
this problem of representation in the age of artificial intelligence.
Exhibit.ai, the first project, was conceived by the prominent Australian 
law firm Maurice Blackburn and focuses on the experiences of asylum 
seekers incarcerated in one of Australia’s infamous “offshore processing 
centers.” It attempts to bring ‘justice through synthesis’, to mitigate 
forms of political erasure by generating an artificial record using AI 
imagery. Calculating Empires: A Genealogy of Power and Technology, 
1500-2025, the second project, is a “large-scale research visualization 
exploring the historical and political dependence of AI on systems of 
exploitation in the form of a room-sized flow chart.”
On the surface, the two projects could not be more different: the first 
using AI image generators to create photorealistic depictions of 
political violence as a form of nonhuman witnessing (Richardson), the 
second using more-or-less traditional forms of data visualization and 
information aesthetics to render visible the socio-technical 
‘underbelly’ of artificial intelligence. And yet, as we argue, both 
projects construct a highly questionable representational politics of 
artificial intelligence, where a tool which itself is unrepresentable 
for technical reasons becomes an engine of ethical and political 
representation. While images are today said to be “operational”, meaning 
that they no longer function as primarily indexical objects, AI images 
(arguably the most operational image) are now asked to do the 
representational (and profoundly political) work of exposing regimes of 
power, exploitation, and violence.
*Maya Indira Ganesh *is Associate Director (Research Culture and 
Partnerships), co-director of the Narratives and Justice Program 
<https://www.lcfi.ac.uk/research/programme/ai-narratives-and-justice>, and 
a Senior Research Fellow at the Leverhulme Centre for the Future of 
Intelligence at the University of Cambridge. From October 2021 to July 
2024 she was an assistant teaching professor co-directing the MSt in AI 
Ethics and Society <https://www.lcfi.ac.uk/education/mst/> at the 
university. Maya has degrees in Psychology, Media and Cultural Studies, 
and a DPhil in Cultural Studies. Her doctoral work took the case of the 
‘ethics of autonomous driving’ to study the implications of ethical 
decision-making and governance by algorithmic/AI technologies for human 
social relations. Her monograph based on this thesis, /Auto-Correct: The 
Fantasies and Failures of AI, Ethics, and the Driverless Car/, will be 
available on March 10, 2025 and can be pre-ordered here 
<https://artezpress.artez.nl/books/auto-correct/>. Maya’s most recent 
project, with Louise Hickman and others, is AI in the Street 
<https://www.careful.industries/ai-in-the-street/overview>, a project 
about AI in public and AI’s marginalised and expert publics.
*Thao Phan* is a feminist science and technology studies (STS) 
researcher who specialises in the study of gender and race in 
algorithmic culture. She is a Lecturer in Sociology (STS) at the 
Research School for Social Sciences at the Australian National 
University (ANU). Thao has published on topics including whiteness and 
the aesthetics of AI, big-data-driven techniques of racial 
classification, and the commercial capture of AI ethics research. She is 
the co-editor of the volumes /An Anthropogenic Table of Elements/ 
(University of Toronto Press) and /Economies of Virtue: The Circulation 
of 'Ethics' in AI/ (Institute of Network Cultures), and her writing 
appears in journals such as /Big Data & Society/, /Catalyst: Feminism, 
Theory, Technoscience/, /Science as Culture/, and /Cultural Studies/. She is 
a member of the Australian Academy of Science’s National Committee for 
the History and Philosophy of Science and is the co-founder of 
AusSTS—Australia’s largest network of STS scholars.
---------------
The COMMLIST
---------------
This mailing list is a free service offered by Nico Carpentier. Please use it responsibly and wisely.
--
To subscribe or unsubscribe, please visit http://commlist.org/
--
Before sending a posting request, please always read the guidelines at http://commlist.org/
--
To contact the mailing list manager:
Email: (nico.carpentier /at/ commlist.org)
URL: http://nicocarpentier.net
---------------