In May, San Francisco became the first city in the world to ban the municipal government's use of face surveillance technology. Days later, Somerville, Mass., became the first city on the East Coast to do the same. Now -- thanks to a movement led by dozens of civil rights organizations nationwide -- municipalities and states across the country are debating the government's use of a technology that poses unprecedented threats to our civil rights and civil liberties.
Face recognition systems use computer algorithms paired with databases to analyze and classify images of human faces in order to identify or track people. The technology is currently entirely unregulated in the United States, but police departments and other government agencies are nonetheless using it -- too often in secret. But it's not like what you've seen on cop shows like CSI; face recognition doesn't always work. And the inaccuracies are particularly damaging for certain groups of people, namely Black women and trans and nonbinary people.
A study conducted by Massachusetts Institute of Technology researcher Joy Buolamwini, for example, found significant racial and gender bias in facial recognition algorithms. Buolamwini found that face recognition algorithms can misclassify Black women's faces nearly 35 percent of the time. These same algorithms almost always get it right when classifying white men's faces.
Moreover, this technology too often fails to account for transgender and nonbinary people. Studies show face recognition products sold by Amazon, IBM, and Microsoft consistently misclassify people in our community.
A critical shortcoming of this technology is that it has been programmed to read people as either male or female -- a technological assertion that the gender binary is immovable, fixed, and here to stay. Even within the confines of this rigid binary, the tech has an extremely retrograde view of what "male" and "female" look like. For example, systems can be programmed to recognize short hair as a "male" trait or makeup as a "female" characteristic. These outcomes reflect choices made by computer programmers about which images they will use to train algorithms as well as how those training data are classified.
A recent study revealed that face recognition technology can only accurately guess the genders of cisgender women and cisgender men. On average, the study showed the technology gets it right for cisgender women 98.3 percent of the time, and for cisgender men 97.6 percent of the time. But the same systems consistently performed worse on transgender individuals and were universally unable to classify nonbinary genders. The harms are already materializing. A 2018 report, for example, found that some transgender Uber drivers had their accounts suspended because the company uses face recognition software as a built-in security feature, and the software was unable to recognize the faces of drivers who were transitioning.
That's why the Massachusetts Transgender Political Coalition is an active part of a coalition calling for the Massachusetts state legislature to press pause on government use of face surveillance. The need to bring these technologies under democratic control is particularly urgent for my community: trans people of color.
Face surveillance technology's mistakes can pose serious problems for trans folks and for anyone else who does not conform to traditional expectations of binary gender presentation. When governments use these systems, trans and nonbinary people are at risk of being misgendered or even rendered invisible. That's the last thing we need for communities that already face serious stigma and high rates of homelessness and incarceration -- and, for Black trans women, the highest murder rate in the country.
Transgender and nonbinary members of the LGBTQ+ community already face significantly increased risks of depression, anxiety, and victimization, along with diminished gender affirmation and social support from family members and others. A number of studies, including one led by the Boston University School of Public Health and Harvard Medical School, show devastating disparities in mental health issues and suicidal ideation among transgender and nonbinary students. Social denial of gender identity, cyberbullying, social isolation, and discrimination can all contribute to these issues. LGBTQ youth who are coming to terms with their own identities and gender expression are often deeply affected by these factors and are constantly bombarded by social systems and infrastructure reinforcing that how they identify as a person is "incorrect."
Face surveillance technology's rigid adherence to traditional gender norms threatens to exacerbate these existing health risks and erase the 21st-century existence of anyone who does not conform. On a systemic level, the use of these technologies underscores rather than challenges the notion that transgender and gender-nonconforming persons are not part of "normal" society.
We cannot allow technology to be used in areas like government services, policing, or border control if it excludes and alienates or otherwise ignores an entire class of people. Unfortunately, absent regulation, that's exactly what's happening.
Today, face recognition systems consistently fail to identify transgender and nonbinary people, meaning millions of human beings are at risk of being rendered invisible in our increasingly digital world. But we won't return to the 1950s, whether through the Trump administration's attempts to roll back antidiscrimination protections or through an automated enforcement of outdated gender norms.
We must press pause on the government's use of face surveillance technology -- in the Bay State and beyond. Winning this fight and getting this right are critical to the welfare of transgender, gender-nonconforming, nonbinary, and agender members of the LGBTQ community.
Tre'Andre Valentine is the executive director of the Massachusetts Transgender Political Coalition.