Gender Shades, Credit: MIT Media Lab

Gender Shades

Joy Buolamwini, Timnit Gebru

Joy Buolamwini and Timnit Gebru investigated bias in AI facial recognition programs.

The study reveals that popular facial recognition applications already in use display clear discrimination on the basis of gender and skin color. A further cause of these unfair results lies in erroneous or incomplete data sets on which the programs are trained. In medical applications, for example, this can be a serious problem: simple convolutional neural networks can already detect melanoma (malignant skin changes) as reliably as experts can, but information about skin color is crucial to that task. The two researchers therefore created a new benchmark data set, that is, a new basis for comparison. It contains the data of 1,270 parliamentarians from three African and three European countries. Buolamwini and Gebru have thus created the first benchmark data set that covers all skin types while also allowing gender classification by facial recognition to be tested.
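To illustrate the kind of evaluation such a benchmark makes possible, the short Python sketch below computes classification accuracy separately for each combination of skin type and gender, which is how accuracy gaps between subgroups become visible. The records and group labels here are purely hypothetical placeholders, not data or code from the study itself.

# Illustrative sketch of a disaggregated accuracy audit (not the Gender Shades code).
# Each record is hypothetical: (true_gender, predicted_gender, skin_type_group).
from collections import defaultdict

records = [
    ("female", "female", "darker"),
    ("female", "male",   "darker"),   # one misclassification in this subgroup
    ("male",   "male",   "darker"),
    ("female", "female", "lighter"),
    ("male",   "male",   "lighter"),
    ("male",   "male",   "lighter"),
]

totals = defaultdict(int)
correct = defaultdict(int)

for true_label, predicted_label, skin_group in records:
    key = (skin_group, true_label)          # audit each subgroup separately
    totals[key] += 1
    correct[key] += int(true_label == predicted_label)

for key in sorted(totals):
    accuracy = correct[key] / totals[key]
    print(f"skin type: {key[0]:7s} gender: {key[1]:6s} accuracy: {accuracy:.0%}")

Reporting accuracy per subgroup rather than as a single overall number is the design choice that matters: an aggregate figure can look high even when one group is misclassified far more often than another.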


Credits: Joy Buolamwini, Founder of the Algorithmic Justice League and Poet of Code