Women in Media Arts: Does AI think like a (white) man?

Gender Shades, Joy Buolamwini (US), Timnit Gebru (ETH), photo: Jürgen Grünwald

Through various initiatives, Ars Electronica works to raise public awareness of women working in media art and to highlight role models for girls and women. This motivation has given rise to Women in Media Arts – a comprehensive database devoted specifically to women in media art. The database began with information on female protagonists who have left their mark on Ars Electronica’s 36-year history and was later expanded to include a public input module.

Users are invited to contribute articles on women artists. The entries offer female artists the opportunity to present themselves on the platform, even if they have not appeared in the context of Ars Electronica. The database serves as an active research platform for artists, curators and scholars and for anyone interested in the topic of Women in Media Art.

Women in Media Arts is continually updated and expanded. The platform is accessible via the Ars Electronica Online Archive.

Impression from Women in Media Arts, photo: Tom Mesic

Do intelligent machines think like a white man?

Artificial intelligence is becoming an increasingly important topic in the diversity discourse – with good reason! More and more activists are pointing out the problematic prejudices and distortions of supposedly objective algorithms. This has to do not only with the low proportion of female programmers and of women in the IT sector in general, but above all with biased data sets. The guiding principle: an AI is only as good, and only as fair, as the data it is trained on.

The following artists and activists have engaged with these technical developments and their social consequences. They have transformed the insights they gained into projects that we would like to present to you here briefly:

Joy Buolamwini (US), Timnit Gebru (ETH): Gender Shades

Joy Buolamwini and Timnit Gebru investigated the prejudices of AI face recognition programs in “Gender Shades”. The programs’ error rates are significantly higher for women, and especially for women with darker skin colour.

The study shows that popular applications clearly discriminate on the basis of gender and skin colour, a bias that is built in at the programming stage. A further source of discriminatory results is the incorrect or incomplete data sets used to train these programs.

This can become a problem in medical applications: even simple convolutional neural networks – a type of artificial neural network – can detect melanomas (malignant skin changes) in images as well as experts can. Information about skin colour is extremely important in this context.
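As a rough illustration of what such a network looks like in code, the sketch below defines a tiny convolutional classifier for lesion images using PyTorch. It is a generic, minimal example and not the model from any particular study; the layer sizes, the 224x224 input resolution and the two-class output are assumptions made purely for illustration, and whether such a model is fair depends entirely on whether its training images cover all skin colour types.

```python
# Generic sketch of a small convolutional classifier for skin-lesion images
# (melanoma vs. benign), using PyTorch. This is an illustrative toy model,
# not the network from any specific study; its fairness depends on whether
# the training images cover all skin colour types.

import torch
import torch.nn as nn

class LesionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, 2)  # assumes 224x224 RGB input

    def forward(self, x):
        x = self.features(x)       # (batch, 32, 56, 56)
        x = torch.flatten(x, 1)
        return self.classifier(x)  # logits: benign vs. melanoma

model = LesionCNN()
logits = model(torch.randn(1, 3, 224, 224))  # one dummy RGB image
```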

The two researchers have therefore created a new benchmark data set, i.e. a new standard of comparison. It contains data on 1,270 parliamentarians from three African and three European countries. Buolamwini and Gebru have thus created the first benchmark data set that covers all skin colour types and can be used to test face-based gender recognition.
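To make the logic of such an audit concrete: the core idea can be sketched as a disaggregated evaluation, in which a classifier’s error rate is computed separately for each subgroup rather than as a single overall accuracy. The snippet below is a minimal illustration of that idea, not the researchers’ own code; the sample records, field names and dummy classifier are invented placeholders.

```python
# Minimal sketch of a disaggregated audit in the spirit of "Gender Shades":
# instead of reporting one overall accuracy, compute the error rate
# separately for every subgroup (here: gender x skin type).
# The sample records and dummy_classifier are hypothetical placeholders,
# not the original benchmark data or any real model.

from collections import defaultdict

def disaggregated_error_rates(samples, classify):
    """samples: iterable of dicts with 'image', 'gender' (ground truth)
    and 'skin_type'. classify(image) returns a predicted gender."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for s in samples:
        group = (s["gender"], s["skin_type"])
        totals[group] += 1
        if classify(s["image"]) != s["gender"]:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Dummy stand-in for a face-based gender classifier under audit.
def dummy_classifier(image):
    return "female"

samples = [
    {"image": None, "gender": "female", "skin_type": "darker"},
    {"image": None, "gender": "male", "skin_type": "lighter"},
]
print(disaggregated_error_rates(samples, dummy_classifier))
# A single overall accuracy can look high while one subgroup, e.g.
# darker-skinned women, has a far higher error rate - the gap such
# an audit makes visible.
```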

Mary Flanagan (US): [help me know the truth]

That a discriminatory algorithm emerges not from some sexist or racist nature of the machine, but from the structures of our society, becomes clear once again in Mary Flanagan’s project “[help me know the truth]”.

Based on findings from cognitive neuroscience and the results of visitor participation, [help me know the truth] creates the – quite literally – perfect stereotype from a digital self-portrait.

Simple questions invite visitors to choose between two slightly different portraits. The criteria used to differentiate range from “Choose the victim” to “Identify the leader”. Among other things, participants also decide which face is the most angelic, the friendliest or the most criminal.

Caroline Sinders (US): Feminist Data Set

Caroline Sinders wants to actively counteract such bias. Feminist Data Set is an ongoing, multi-year art project that combines lectures and workshops with a call to collect feminist data, intended as an intervention in the field of machine learning. What is feminist data? Feminist data can be artworks, essays, interviews and books on feminism or written from a feminist perspective. This feminist data set is meant to counteract bias in machine learning and to introduce data collection as a feminist practice.

Caroline Sinders, together with Anna Ridler, has also won the current AI Lab residency; they will work on their project “AI isn’t Artificial but Human” at the Edinburgh Futures Institute (EFI) in Edinburgh and the Ars Electronica Futurelab in Linz.

Feminist Data Set, Caroline Sinders, photo: Rachel Steinberg

Birgitte Aga (UK), Coral Manton (UK): Women Reclaiming AI

The artists Birgitte Aga and Coral Manton offer another example of an activist approach to artificial intelligence. In “Women Reclaiming AI” they point to the ubiquitous portrayal of women by AI voice assistants as gendered, subordinate and servile. These systems are usually developed by teams that lack diversity, and they thus embed distorted world views and stereotypes that reinforce traditional gender roles. With the WRAI project, the artists aim to reclaim female voices in the development of future AI systems by empowering women to use conversational AI as a medium for protest. By creating their own voice assistants, participants challenge these gender roles.
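To give a rough sense of what creating one’s own language assistant can involve at its simplest, the sketch below shows a minimal rule-based conversational agent whose replies refuse the servile persona of commercial voice assistants. It is a generic, workshop-style illustration, not code from the Women Reclaiming AI project; the intents and responses are invented placeholders.

```python
# Minimal rule-based conversational agent, as one might prototype in a
# workshop. This is a generic illustration, not code from the Women
# Reclaiming AI project; the intents and replies are invented placeholders.

RESPONSES = {
    "hello": "Hello. I speak for myself, not for a brand.",
    "tell me a joke": "I'd rather tell you who trained me, and on what data.",
    "who made you": "A group of women who decided to write their own assistant.",
}

DEFAULT = "I don't do small talk on command. Ask me something that matters."

def reply(utterance: str) -> str:
    """Match the user's utterance against known intents, else push back."""
    key = utterance.lower().strip(" ?!.")
    return RESPONSES.get(key, DEFAULT)

if __name__ == "__main__":
    print(reply("Hello"))
    print(reply("Tell me a joke?"))
```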

Further insights into the (lack of) female perspective in artificial intelligence can be found in the lecture by the two artists at last year’s Ars Electronica Festival: “Talk about inclusive AI”
