Meilan Solly wrote an article about how ImageNet Roulette, an AI tool created by the artist Trevor Paglen with the researcher Kate Crawford, is biased in how it categorizes people. When a white person uploads a picture of themselves, the tool tends to label them with an occupation or a neutral characteristic; when people of other races upload their pictures, it often labels them with stereotypes instead. This is probably due to the data used to build the program: there was most likely more data sampled from white people, and for other races the creators' biases may have influenced the labels, or the data set may simply not have been diverse enough. This shows how important it is to pick good, diverse data samples and to keep in mind the wide range of people who will actually use the program. Even though a programmer might be designing for themselves when creating an app, at some point in the process it is important to step back and consider the users as well. The data is just a summary of how the developers chose to categorize people, and because of that, the developers' biases leak into the program.
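To make that last point concrete, here is a minimal sketch in Python of how annotation bias flows straight through to a model's output. This is not how ImageNet Roulette actually works; the groups, the labels, and the trivial majority-label "classifier" are all invented for illustration. The point is only that a model trained on skewed labels can do nothing but echo them back.

```python
from collections import Counter
import random

random.seed(0)

# Hypothetical toy training set: each record is (group, label).
# The skew is built in on purpose, mirroring what the article describes:
# annotators gave one group occupational labels and the other group
# stereotyped ones. No real data is used here.
training_data = (
    [("group_a", random.choice(["doctor", "pilot", "teacher"])) for _ in range(90)]
    + [("group_b", random.choice(["stereotype_1", "stereotype_2"])) for _ in range(90)]
)

def fit(data):
    """'Train' a majority-label classifier: for each group, remember
    the most common label the annotators assigned to it."""
    labels_by_group = {}
    for group, label in data:
        labels_by_group.setdefault(group, Counter())[label] += 1
    return {g: counts.most_common(1)[0][0] for g, counts in labels_by_group.items()}

model = fit(training_data)

# The model can only summarize the annotations it was given, so whatever
# bias the labelers encoded for each group comes straight back out.
print(model["group_a"])  # prints an occupation
print(model["group_b"])  # prints a stereotyped term
```

Even though nothing in the "algorithm" itself mentions race, the output for each group is entirely determined by what the labelers put in, which is exactly the sense in which developers' biases leak into a program through its data.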