http://unbias.wp.horizon.ac.uk/2017/07/28/a-call-for-use-case-examples/

As part of our work to contribute to the development of the IEEE P7003 Standard for Algorithm Bias Considerations, we are reaching out to the community of stakeholders to ask for use cases highlighting real-world instances of unjustified and/or inappropriate bias in algorithmic decisions.

The purpose of the P7003 Standard is to establish a methodological framework that will help designers and regulators of algorithmic systems follow best practices for minimizing unjustified and inappropriate bias. This aim will only truly be achieved if the P7003 standard is recognized by prospective users, i.e. developers, service providers and regulators, as useful to them because it addresses real-world issues they are facing. We are therefore seeking to collect use cases that highlight these issues.

The cases we are most interested in are not necessarily grand, headline-grabbing ones, but rather any case with clear implications for identifiable industry/service sectors (e.g. a use case about face recognition software failing to detect the faces of People of Colour has implications for services such as Automated Passport Control that use image recognition algorithms).

To help communicate the nature and properties of the use cases, we encourage the use of the following template.

Example use case

Title of use case: Beauty contest judging algorithm biased to favour white participants

Contributor of the use case (name, affiliation, contact): Ansgar Koene, University of Nottingham, [email protected]

Area: other (entertainment)

Discrimination: race

Status of the use case: completed

Relevant stakeholders: end-users, private sector organizations (e.g. advertisers)

Problem description: An attempt to provide an objective (culturally and racially neutral) judgement of female beauty by using algorithms, trained on crowd-sourced data, to rate the photos of participants. Roughly 6,000 people from more than 100 countries participated by submitting photos. Out of 44 winners, nearly all were white, a handful were Asian, and only one had dark skin.

Why it matters (from P7003’s viewpoint): The idea that algorithms can turn judgements about inherently subjective matters (in this case, human beauty) into objective, culturally and racially neutral judgements is deeply flawed. The data used to establish the judgement criteria was probably itself culturally and racially biased, so the algorithm could only reproduce that bias.
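To make this mechanism concrete, below is a minimal sketch in Python using entirely synthetic data and a toy scoring model; it is not Beauty.AI's actual system, and all names and numbers in it are illustrative assumptions. It shows how a skew in crowd-sourced training ratings propagates into the scores an "objective" algorithm produces.

    # Hypothetical illustration: bias enters through crowd-sourced labels,
    # not through any explicit rule in the scoring code.
    import random

    random.seed(0)

    def make_photo(group):
        # One synthetic feature (e.g. skin tone) encodes group membership;
        # the rest is noise unrelated to the rating task.
        return {"tone": 0.9 if group == "A" else 0.1,
                "noise": random.random(),
                "group": group}

    # Crowd-sourced training set: raters over-represent group A and, on
    # average, rate group-A photos higher.
    train = []
    for _ in range(1000):
        group = "A" if random.random() < 0.8 else "B"
        base = 7.0 if group == "A" else 5.0
        train.append((make_photo(group), base + random.gauss(0, 1)))

    # A deliberately simple "model": the predicted score of a new photo is
    # the mean training rating of photos with a similar tone feature.
    def predict(photo):
        similar = [r for p, r in train if abs(p["tone"] - photo["tone"]) < 0.2]
        return sum(similar) / len(similar)

    for group in ("A", "B"):
        scores = [predict(make_photo(group)) for _ in range(100)]
        print(group, round(sum(scores) / len(scores), 2))
    # Typical output: group A scores around 7.0, group B around 5.0.

The point of the sketch is that the scoring rule never mentions group membership; the disparity arrives entirely through the biased training labels, which is exactly the failure mode this use case illustrates.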

Action taken: The Beauty.AI contest appears to have been discontinued.

Relevant regulations and/or standards (in force, coming): None

Further information (URLs, etc.):

http://beauty.ai/

https://www.theguardian.com/technology/2016/sep/08/artificial-intelligence-beauty-contest-doesnt-like-black-people
