[ecrea] AI, gender and health and social care - PhD Studentship at University of Brighton
Sat Oct 13 08:43:52 GMT 2018
ALGORITHMIC GENDER BIAS IN HEALTH AND SOCIAL CARE PhD STUDENTSHIP
Economic and Social Research Council (ESRC) ARTIFICIAL INTELLIGENCE 2018
South Coast Doctoral Training Partnership
Applications are invited for an interdisciplinary four-year PhD
studentship at the research Centre for Digital Media Cultures and the
Centre for Applied Data Analytics at the University of Brighton. We
invite candidates with a social sciences or computer sciences background, or
both. The successful candidate will join an internationally recognised
team of researchers to investigate the ways in which algorithmic gender
bias occurs, and how it can be mitigated, in the specific
context of Health and Social Care (H&SC) provision. You will a)
map algorithm-based systems currently in use by H&SC providers, b) develop
methods that identify gender bias within AI systems, and c) examine ways
in which bias can be removed. This PhD studentship will provide
outstanding training in social research methods and computer science data
analytics, producing insights into algorithmic bias and developing a
thorough understanding of the social, economic, cultural, ethical and
governance aspects of Artificial Intelligence for future societies.
The successful candidate will be supervised by Prof Flis Henwood (School
of Applied Social Sciences), Dr Aristea Fotopoulou (School of Media), and
Prof Anya Belz (School of Computing, Engineering and Mathematics).
The University of Brighton has internationally recognised strengths in AI
research in social science, arts and humanities and computer science
contexts, spanning research fields such as data analytics, natural
language processing, human-computer interaction and systems security,
computer graphics, dialogue and narrative generation, and novel user
experiences including Augmented/Virtual/Mixed Reality. The student will
be formally located in the School of Applied Social Sciences but have
access to outstanding cross-disciplinary research environments through
participation in both the Digital Media Cultures and
Applied Data Analytics research centres, enabling them to develop the
necessary skills and competences in social framing of technologies,
computational technologies, economic imperatives, cultural effects, ethics
and governance of Artificial Intelligence for future societies.
Eligibility
We fund students who will undertake Masters + PhD programmes (1+3
funding) and stand-alone PhD programmes (+3 funding).
For the Masters + PhD route, you are expected to have a good honours
degree at first or upper second-class level, from a UK academic higher
education institution or international equivalent, in a relevant social
sciences discipline (e.g., anthropology, geography, media and cultural
studies, politics, psychology, science and technology studies,
sociology) or in computer science, data science, mathematics, or statistics.
For the stand-alone PhD programme (+3 PhD award), you would have achieved
a level of research training that would allow you to proceed directly to
PhD. This is usually through the attainment of a previous masters
qualification in the social sciences with a Distinction grade.
For more information about how to apply please see:
https://www.brighton.ac.uk/research-and-enterprise/postgraduate-research-degrees/funding-opportunities-and-studentships/2018-algorithmic-gender-bias-in-health-and-social-care.aspx
Deadline for applications: Sunday 4 November 2018, 11.59pm
---------------
The COMMLIST
---------------
This mailing list is a free service offered by Nico Carpentier. Please
use it responsibly and wisely.
--
To subscribe or unsubscribe, please visit http://commlist.org/
--
Before sending a posting request, please always read the guidelines at
http://commlist.org/
--
To contact the mailing list manager:
Email: (nico.carpentier /at/ vub.ac.be)
URL: http://nicocarpentier.net
---------------