Hazard: Ranks Or Classifies People#

A red diamond-shaped outline (like a warning sign) with a set of weighing scales in the middle and a person sitting on each side of the scales.


Rankings and classifications of people are hazards in their own right and should be handled with care.

To see why, consider what happens when the ranking or classification is inaccurate, when people disagree with how they are ranked or classified, who the ranking or classification is and is not working for, how it can be gamed, and what it is used to justify or explain.


Example 1: Facial recognition categorising human images by sexual orientation.

Example 2: School league tables (which rank the performance of schools).

Safety Precautions#

  • Test the effect of the algorithm or technology for different marginalised groups.

  • Carefully consider the validity of any classification groups and work with subject specialists from the application area on this.

  • Be transparent about the weaknesses of the algorithm and technology: test how it can be fooled.

  • Consider alternatives to ranking/classification, for example treating people equally, increasing resources for the issue at hand, or allowing people to self-select.

  • Consider the implication of sampling bias in the data used as inputs to classification models, which may result in overrepresentation of particular demographic groups or might mean there is limited training data for particular classes.
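Two of the precautions above, testing the effect on different groups and checking for under-represented classes, can be sketched as simple checks. This is a minimal illustration, not a complete fairness audit; the field names (`group`, `label`, `pred`) and the toy records are hypothetical:

```python
# Hedged sketch: disaggregated evaluation and class-balance checks.
# All field names and data below are hypothetical illustrations.
from collections import Counter

def accuracy_by_group(records):
    """Compute accuracy separately for each demographic group.

    `records` is a list of dicts with hypothetical keys:
    'group' (demographic group), 'label' (true class), 'pred' (model output).
    Reporting one overall accuracy can hide poor performance on
    marginalised groups; breaking it down per group surfaces that.
    """
    correct = Counter()
    total = Counter()
    for r in records:
        total[r["group"]] += 1
        if r["label"] == r["pred"]:
            correct[r["group"]] += 1
    return {g: correct[g] / total[g] for g in total}

def class_counts(records):
    """Count examples per class to surface under-represented classes
    (a symptom of the sampling bias mentioned above)."""
    return Counter(r["label"] for r in records)

# Toy illustration: the model is perfect on group A but not on group B.
records = [
    {"group": "A", "label": 1, "pred": 1},
    {"group": "A", "label": 0, "pred": 0},
    {"group": "B", "label": 1, "pred": 0},
    {"group": "B", "label": 0, "pred": 0},
]
print(accuracy_by_group(records))  # accuracy per group
print(class_counts(records))       # examples per class
```

Checks like these only surface problems; deciding whether a disparity is acceptable, and what to do about it, still requires working with subject specialists as described above.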