"False"
Skip to content
printicon
Main menu hidden.
Published: 2023-05-04 Updated: 2023-05-05, 23:48

AI needs a queer perspective for an inclusive future

NEWS For several years now, Artificial Intelligence (AI) has been able to recognise faces through machine learning. At the same time, researchers emphasise the risks associated with this kind of visibility and the normative nature of the data that AI is trained on. In an article, researchers at Umeå University argue that a queer perspective on AI can help identify the risks of facial recognition and point the way towards a more inclusive technology.

Text: Sandra Lundström

– AI technologies such as image detection, facial recognition and classification constitute a particularly relevant area where we can see a clear intertwining between human identity and algorithmic computation. Facial recognition technologies rely on binary divisions and categorisations. But reality is more complex than that, and many people today do not want to, or are unable to, identify within such divisions. Queer theory examines how norms for gender and sexuality are created and maintained in society, and what consequences this may have. These are important questions to ask given AI's rapid progress and implementation in society, says Evelina Liliequist, postdoc at the Centre for Regional Science at Umeå University and affiliated researcher at Humlab.

Facial recognition is widely used today

Facial recognition is a technology that involves, for example, recording, analysing and comparing a person's facial shape, features and movements. This type of technology is widely used today.


– We can unlock our phone by just showing our face, or our photo app can automatically tag the people who appear in our photos by recognising faces. It is even used in e-passport gates, where your face is compared to the biometric data in your passport. Like most technology, facial recognition can be used to make our lives more convenient, but it can also be used in ways that we may not approve of as a society. For example, there are uses in security and surveillance that have been controversial and banned in some places, says Andrea Aler Tubella, senior research engineer at the Department of Computing Science and affiliated with Humlab.

Builds on cultural norms

To teach AI to recognise faces, systems need access to data (faces), which in turn is categorised in a binary fashion: male/female, child/adult, human/animal. Much of this categorisation is culturally determined: for example, a person with a beard is usually assumed to be male. One problem is that the data that AI trains on lacks broader representation.
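The constraint described here can be made concrete with a minimal, hypothetical sketch. The labels, threshold and scores below are invented for illustration and are not drawn from any real system; the point is only that once a binary label set is baked into the data, every output is forced into one of the two categories:

```python
# Hypothetical sketch: a binary label scheme fixed at data-collection time.
LABELS = ("male", "female")

def classify(score: float) -> str:
    """Toy stand-in for a trained model: maps a score in [0, 1]
    to one of two predefined labels. The schema offers no way to
    output 'neither', 'both', or 'prefer not to say'."""
    return LABELS[0] if score < 0.5 else LABELS[1]

# Whatever the input, the output always falls inside the binary scheme.
for s in (0.1, 0.49, 0.51, 0.9):
    assert classify(s) in LABELS
```

Widening the label set does not by itself solve the problem the researchers describe, since any fixed schema still assumes identity is readable from the face; the sketch only shows how the binary division is enforced mechanically.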


– With current technology, there is a risk of cementing essentialist understandings of sexuality and gender identities as natural and readable on the body, without considering people's self-identification and the level of openness a person wants, dares, or can express, says Evelina Liliequist.

The article Beyond the Binary - Queering AI for an Inclusive Future is written by researchers from different fields: ethnology, computer science, informatics and digital humanities.

– It's about the future we want to see, and the interdisciplinary meeting of humanities and technical subjects is a strength that should be widely used.

Read the article

Beyond the Binary - Queering AI for an Inclusive Future

The article is based on a chapter in the upcoming book Handbook of Critical Studies of Artificial Intelligence, which will be published in November 2023.

Authors of Beyond the Binary - Queering AI for an Inclusive Future