Machines, Artificial Intelligence and rising global transphobia

How do facial recognition technology and surveillance capitalism put trans people in danger?

By Jean Linis-Dinco
PhD in Digital Media and Communications Studies, 
CAIDE Summer Research Academy

In recent years, transgender people have finally been able to step out from the periphery of the LGBT movement and start filling the airwaves. Yet the increased visibility of transgender issues has come packaged with an increase in securitisation, surveillance and violence. In this article, I elaborate on how new technologies have exacerbated the securitisation and surveillance of trans bodies, and the violence against them, worldwide.

At a single glance, facial recognition technology can now classify the apparent gender of many people. But it is not without its problems. According to new research from the University of Colorado Boulder, the most popular facial recognition technologies on the market routinely misclassify the gender of trans and gender-non-conforming individuals. There is no argument that misgendering is disrespectful; above all, it perpetuates a system of oppression that relegates those who do not fit into gender binaries to a subclass of human existence. And for members of the trans community, who have been historically maligned and marginalised, being pushed back into the shadows invalidates their personhood. A lack of disaggregated data means that the inequalities faced by transgender individuals will remain indiscernible, which could have a tremendous effect on the decision-making processes that aid in the realisation, protection and fulfilment of their human rights. Artificial intelligence technologies are the new gatekeepers who get to decide whose lives matter more, and because these technologies are not without their biases, they have become the latest battlefield of class struggle.
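To make the structural problem concrete, here is a minimal sketch in Python of how a typical binary gender classifier is wired. This is not any vendor's actual code; the embedding, weights and labels are hypothetical stand-ins. The point is architectural: when the output layer has only two classes, the model must assign every face a binary label no matter how uncertain it is, so the misgendering of non-binary people is guaranteed by design.

```python
import numpy as np

# Hypothetical sketch of a commercial gender classifier's final layer.
# The output space is hard-coded to two classes: there is simply no
# label the model could ever produce for a non-binary person.
LABELS = ["male", "female"]

def classify_gender(face_embedding: np.ndarray, weights: np.ndarray) -> str:
    """Toy binary classifier: logits -> softmax -> argmax over two labels."""
    logits = weights @ face_embedding               # shape (2,)
    probs = np.exp(logits) / np.exp(logits).sum()   # softmax over two classes
    # Even if probs is ~[0.51, 0.49] (the model is essentially unsure),
    # argmax still forces one of the two binary labels onto the face.
    return LABELS[int(np.argmax(probs))]

rng = np.random.default_rng(0)
embedding = rng.normal(size=128)     # stand-in for a real face embedding
weights = rng.normal(size=(2, 128))  # stand-in for trained weights
print(classify_gender(embedding, weights))  # always "male" or "female"
```

Whatever the training data looks like, a two-class output head can never be "accurate" for someone outside the binary; the error is built into the model's shape, not just its weights.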

Trans rights

The ‘surveillance capitalism’ brought about by the neoliberal agenda needs an ‘other’ to flourish, and who makes better prey than transgender people, who are ‘culturally constructed as concealing something’? (p.209) Transgender people are often stopped and searched at airports, mainly because their travel documents are frequently incongruent with how they present themselves. Sometimes they get lucky and find an airport officer who understands what is going on. But the automation of these processes rarely leaves space for those who do not fit the binary cookie-cutter, so trans people are automatically branded as fraudulent. AI is garbage in, garbage out: if we feed it training data that devalues trans people, we multiply that bigotry and turn it into an institution. And this does not happen by chance, because it is most often the powerful and the rich who hold a monopoly on how technology is deployed to maintain the status quo.
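The ‘garbage in, garbage out’ dynamic can be shown in a few lines of Python. The data below is invented purely for illustration: if the training labels encode the assumption that a mismatch between a traveller's documents and their presentation equals fraud, then even the simplest possible model learns to flag every such traveller, turning a prejudiced labelling rule into an automated institution.

```python
import numpy as np

# Invented, illustrative data: 1 = traveller's documents don't match
# their presentation. The biased labelling rule treats every mismatch
# as "fraud" -- exactly the prejudice described above.
rng = np.random.default_rng(1)
n = 1000
doc_mismatch = rng.integers(0, 2, size=n)
label_fraud = doc_mismatch.copy()  # garbage in: mismatch == "fraud"

# "Train" the simplest possible model: predict the majority label
# observed for each group in the training data.
for group in (0, 1):
    majority = int(label_fraud[doc_mismatch == group].mean() >= 0.5)
    print(f"doc_mismatch={group} -> model predicts fraud={majority}")

# Garbage out: the model flags 100% of mismatched travellers as
# fraudulent, at scale and with the appearance of objectivity.
```

Nothing in the model is malicious; it simply reproduces, automatically and everywhere it is deployed, whatever rule the labellers baked into the data.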

One researcher thought the answer to this problem was to scrape YouTube videos of transgender individuals before, during and after transition and use those videos as training data for facial recognition software. But the people in the videos were never asked whether they wanted their content included in the dataset, which leads us to the larger debate on the ethics of online data collection. On top of that, the research questions themselves are problematic and bring us back to my earlier point on the securitisation of trans bodies. The research, which was partly funded by the FBI and the US Army, sought to identify transgender individuals on the premise that they could be security threats. It rests on the idea that terrorists can take hormones to increase their chances of crossing international borders, a claim that disregards decades of research and advocacy establishing that being trans is not a choice.

This research, like the Clearview software that lets strangers easily discover your name and address from a photo taken as you walk down the street, puts trans people in danger, especially those who may not be out of the closet. Moyer reported that Clearview is now being used by hundreds of law enforcement agencies in the US. And it is only a matter of when, not if, this technology falls into the hands of regimes where being trans is punishable by lengthy jail time, corporal punishment or even death.

For the longest time, blending has helped trans people live in a society that treats them the same way as cis people, far from the hypercritical stares of the public eye. For those unfamiliar with the term, blending is when a transgender person is perceived as cisgender rather than as the sex they were assigned at birth. In countries where being trans is a crime, ‘going stealth’ is the only way to survive. Three hundred and fifty transgender people were killed in 2020 alone, the majority of them in Central and South America, far from the distorted and glamourised Hollywood representation of trans-ness. Human Rights Watch researcher Kyle Knight, for instance, reported that a group of trans women in Indonesia were stalked, detained and chastised for ‘bad morals’ by militant Islamist vigilantes and the police in 2017. The Indonesian example is just one of many cases of deliberate violence against transgender individuals. Now imagine that scenario replicated globally through automated technologies, in the hands of people who refuse to acknowledge transgender people and who demonise them in broad daylight with impunity.

AI technology has undoubtedly made our lives easier, through self-driving cars, music recommendations and home security. But like every technological disruption before it, from the printing press to the advent of social media, it produces winners and losers, victors and vanquished, masters and enslaved. Historically, it is the powerless who have found themselves at the bottom of the barrel: queer and transgender people, people of colour and Indigenous populations. As we move forward, we have to ask questions and investigate who benefits from, and who is harmed by, this new technology. The rise of the Internet in the 1990s was hailed as a great equaliser, but as we have seen in recent years, there has been nothing equal about it. Let us keep an eye on new technologies and ensure they do not follow the same path. As our society becomes ever more focused on individualism and the ‘self’, let us not allow ourselves to become desensitised to the needs and suffering of the underprivileged. Technology, after all, is a double-edged sword, and what matters is which edge we use.

-----

Jean Linis-Dinco is a PhD student at the University of Melbourne. She is interested in using data science tools to predict and analyse the Rohingya crisis in Myanmar. Jean’s past research has looked into HIV/AIDS-related reporting in the Philippines, media sentiment towards LGBTIQ communities in Nepal, and the impact of securitisation on marginalised communities in Indonesia. She has been featured by the Human Rights Campaign in Washington DC as a Global Innovator for her film on transgender rights in the Philippines. Jean has written articles for SHAPE-SEA in Thailand, IFAIR in Germany and the Global Campus of Human Rights in Italy. She has also worked as a Public Information Consultant for the United Nations Office of the High Commissioner for Human Rights.

More Information

Jean Linis-Dinco

jcldinco@jcldinco.com