OPINION: Why AI could be dangerous for LGBT+ people

by Yuri Guaiana | All Out
Tuesday, 13 April 2021 11:07 GMT

FILE PHOTO: A new CCTV camera for facial recognition technology is pictured at the train station Suedkreuz in Berlin, Germany, August 1, 2017. REUTERS/Hannibal Hanschke

* Any views expressed in this opinion piece are those of the author and not of Thomson Reuters Foundation.

Imagine not being able to access a public toilet because an algorithm thinks you don't look male or female enough, or being tracked in countries where same-sex love is still a crime

Yuri Guaiana is senior campaigns manager at All Out

A growing number of governments and companies use artificial intelligence (AI) systems to identify citizens and consumers online or in public spaces such as airports, event venues and shops.

But this technology is fundamentally flawed when it comes to recognising and categorising human beings in all their diversity.

Take automated recognition of gender and sexual orientation, for example. This technology tries to identify both by analysing your name, how you look and sound, and how you move. In reality, it can never actually recognise gender or sexuality; it can only make a prediction.

The benefits of this technology are dubious, but its implications are significant for everyone, especially marginalised groups, including LGBT+ people.  

With governments, police forces and corporations worldwide increasingly using this technology, LGBT+ people risk being excessively monitored, unfairly targeted, discriminated against, having their spaces invaded and being denied access to others.

The algorithms also reinforce outdated stereotypes about race and gender that are harmful to everyone, as well as stereotypes about how LGBT+ people “look”, reviving the old mistaken belief that you can predict character from appearance.

Imagine being screened out of a girls-only social app or denied a public transport discount on Women’s Day because you don’t look “female” enough for some software. Or imagine being denied access to certain adverts at your metro station because the algorithm regularly misgenders you and gives you no opportunity to opt out of the technology.

These examples are all too plausible, and unless we do something to stop it, the reality for many people could be even more sinister.

Imagine the humiliation of trying to catch your flight and being stopped by a computer that determines you don’t match the gender marker in your passport. Or imagine not being able to access a public toilet just because an algorithm thinks your face doesn’t look male or female enough.

This might sound unlikely in more liberal countries, but who can tell how attitudes to LGBT+ rights might shift in the future?

If this weren't scary enough, state-deployed AI surveillance technology could be even more dangerous for LGBT+ communities in countries where same-sex love is still a crime.

“Facial recognition technology could be deployed at scale by malicious actors to rapidly identify people sharing their pictures online, whether publicly or through direct messages. [It] could similarly be used to automatically identify people in captured recordings of protests, in queer nightclubs or community spaces, and other in-person social events,” wrote Nenad Tomasev et al. in a paper titled, 'Fairness for Unobserved Characteristics: Insights from Technological Impacts on Queer Communities'.

Technology should improve people’s lives, not put them at risk. And what is the redeeming value of automated recognition of gender and sexual orientation? Aren’t interests more important than gender or sexual orientation in influencing consumers’ behaviours?

The good news is that we can do something about this. This month, the European Commission, the executive branch of the European Union, plans to propose a legal framework – that is, a set of rules and guidelines – to regulate AI systems. This is a unique opportunity to ban automated recognition of gender and sexual orientation in the EU and prevent such tools from being exported around the world.

That’s why All Out, a global LGBT+ rights organisation, in partnership with Access Now and Reclaim Your Face, has launched a campaign calling on the EU to ban automated recognition of gender and sexual orientation. 

This is a critical step towards ensuring this technology will not be developed, deployed or used by public or private entities across the globe.
