# Hazard: Ranks Or Classifies People

```{image} ../../images/hazards/classifies-people.png
:alt: A red diamond shaped outline (like a warning sign) with a set of weighing scales in the middle, and a person sitting on each side of the scale.
:width: 250px
```

## Description

Rankings and classifications of people are hazards in their own right and should be handled with care. To see why, we can think about what happens when the ranking/classification is inaccurate, when people disagree with how they are ranked/classified, who the ranking/classification is and is not working for, how it can be gamed, and what it is used to justify or explain.

## Examples

__Example 1:__ [Facial recognition categorising human images by sexual orientation](https://www.bbc.co.uk/news/technology-41188560).

__Example 2:__ [School league tables](https://www.bristol.ac.uk/media-library/sites/cmm/migrated/documents/limitations-of-league-tables.pdf) (which rank the performance of schools).

## Safety Precautions

- Test the effect of the algorithm or technology on different marginalised groups (a minimal sketch of a per-group evaluation appears at the end of this page).
- Carefully consider the validity of any classification groups, and work on this with subject specialists from the application area.
- Be transparent about the weaknesses of the algorithm and technology: test how it can be fooled.
- Consider alternatives to ranking/classification, for example treating people equally, increasing resources for the issue at hand, or allowing people to self-select.
- Consider the implications of sampling bias in the data used as inputs to classification models, which may result in overrepresentation of particular demographic groups or limited training data for particular classes.

## Related Resources

[Book] [Sorting Things Out: Classification and Its Consequences by Bowker and Star (1999)](https://mitpress.mit.edu/books/sorting-things-out)
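
## Worked Example: Testing Per-Group Effects

The first safety precaution above calls for testing how an algorithm behaves for different groups. The sketch below shows one minimal way to do this, assuming a model's predictions and the true labels are already collected in a table. The column names (`group`, `label`, `prediction`) and the toy data are illustrative assumptions, not part of any specific project or dataset.

```python
# A minimal sketch: evaluate a classifier's error rates separately for
# each demographic group, so disparities are visible rather than hidden
# inside an overall average.
import pandas as pd


def per_group_error_rates(df: pd.DataFrame) -> pd.DataFrame:
    """Report sample size, accuracy, and false-positive rate per group."""
    rows = []
    for group, sub in df.groupby("group"):
        accuracy = (sub["prediction"] == sub["label"]).mean()
        negatives = sub[sub["label"] == 0]
        # False-positive rate: how often members of this group are
        # wrongly classified as positive. Undefined if the group has
        # no true negatives, hence the NaN fallback.
        fpr = (
            (negatives["prediction"] == 1).mean()
            if len(negatives) > 0
            else float("nan")
        )
        rows.append(
            {"group": group, "n": len(sub), "accuracy": accuracy, "fpr": fpr}
        )
    return pd.DataFrame(rows)


if __name__ == "__main__":
    # Hypothetical toy data standing in for real model outputs.
    df = pd.DataFrame(
        {
            "group": ["a", "a", "a", "b", "b", "b"],
            "label": [0, 1, 1, 0, 0, 1],
            "prediction": [0, 1, 0, 1, 0, 1],
        }
    )
    print(per_group_error_rates(df))
```

Note the `n` column: a gap in accuracy between groups matters little if one group has only a handful of examples, which is itself a signal of the sampling-bias issue raised in the last safety precaution.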