BIas Obliteration in Near-future Information and Computing Algorithms (BIONICA) - A design fiction toolkit

by Alessio Malizia, University of Hertfordshire / School of Creative Arts, responding to Algorithmic Social Justice

I am interested in exploring

Investigating how to create design fiction scenarios that elicit questions about potential biases, fostering balance and fairness. Biases of particular interest include labelling bias (how cases are classified by humans during the data collection phase), confirmation bias, anchoring, and especially algorithmic bias (when an algorithm has inherent racial, gender, or socio-demographic bias due to class imbalance or inadequate labelling of the dataset by biased humans) and automation bias (when humans complacently accept computerised advice even when it is irrational and categorically wrong, with detrimental consequences).
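The class-imbalance mechanism mentioned above can be sketched in a few lines of Python. This is an illustrative toy, not part of the BIONICA toolkit: the labels and proportions are invented, and the "classifier" is deliberately degenerate. It shows how a skewed training set lets a model that never identifies the minority class still report high accuracy.

```python
# Illustrative sketch of algorithmic bias via class imbalance
# (hypothetical data, not from any real system).
from collections import Counter

# Hypothetical dataset: 1 = minority class (e.g. "high risk"),
# 0 = majority class. The minority class is heavily under-represented.
labels = [1] * 10 + [0] * 90

# A degenerate "classifier" that always predicts the majority class.
majority = Counter(labels).most_common(1)[0][0]
predictions = [majority] * len(labels)

# Overall accuracy looks respectable...
accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)

# ...yet every single minority-class case is missed.
minority_recall = sum(
    p == y for p, y in zip(predictions, labels) if y == 1
) / labels.count(1)

print(f"accuracy: {accuracy:.0%}")          # 90%
print(f"minority recall: {minority_recall:.0%}")  # 0%
```

An evaluation that reports only overall accuracy would pass this model; only a per-group metric exposes the bias, which is exactly the kind of question a design fiction scenario can surface before deployment.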

My motivation is

There is evidence that algorithmic biases are usually discovered only after a system or machine has been released onto the market. For instance, a machine learning (ML) algorithm was used in the past to predict a criminal’s risk of recidivism, which in turn was used by judges to determine the length of penalties. Only after the algorithm had been deployed was it shown that the training set was poorly designed, leaving the predictions affected by an evident racial bias. The aim of this project is thus to provide tools that can inform the design of a new generation of machine learning applications, with the goal of discovering possible biases during (not after) the design process. To this end, we aim to exploit design fiction, defining a set of toolkits that make the methodology more accessible and usable for creators of ML algorithms: https://tinyurl.com/sejreea

Project focus

I want to focus on:

  • Recognition

Collaboration

I am looking for a partner in these sectors:

  • Public Sector
  • Industry / SMEs
  • Third Sector

I am looking to work in these areas:

  • Education
  • Finance & Labour market (including banking, gig economy)
  • Charity
  • Artificial Intelligence and/or Machine Learning
  • Web design and digital innovation

Sandpit events

I am going to: London, March 6th 2020

Contact

a.malizia@herts.ac.uk

Catalyst facilitates collaborations between Not-Equal Network+ members who wish to develop and submit a research project in response to Not-Equal’s second Call for Collaborative Proposals.