ImageNet Roulette is a classification tool created by artist Trevor Paglen and AI researcher Kate Crawford as a provocation, designed to make us realise how humans are classified in machine learning systems. The tool regularly returns racist, misogynistic, and cruel results because of the dataset it draws on: ImageNet’s ‘Person’ category, an offshoot of face recognition experiments.
The ImageNet dataset is widely used for object recognition, such as distinguishing apples from oranges in an uploaded image. One of its subsets is the ‘Person’ category, in which images are classified and labelled by race, gender, age, and character. These labels, used to teach the AI, were supplied by lab staff and crowdsourced workers, who introduced their conscious and unconscious opinions and biases into the algorithm.
Shortly after ImageNet Roulette went viral, the ImageNet team announced plans to remove over 600,000 images from its ‘Person’ category. On 27 September 2019, ImageNet Roulette was taken offline, having proved its point about how things can go wrong, and it remains as a physical art installation at the Fondazione Prada Osservatorio in Milan.