Unregulated facial recognition technology presents unique risks for the LGBTQ+ community

It seems consumers today are granted ever-dwindling opportunities to consider the safety and civil liberties implications of a new technology before it becomes widely adopted. Facial recognition technology is no exception. The well-documented potential for abuse and misuse of these tools, built by giant and influential companies as well as government and law enforcement agencies, should give serious pause to anyone who values their privacy – especially members of communities that have been historically marginalized and discriminated against.

The cavalier attitude toward unregulated surveillance tools demonstrated by some law enforcement agencies and other local, state, and federal government entities seems to reinforce the notion that forfeiting your personal data and privacy for greater convenience, efficiency, and safety is a fair trade. For vulnerable communities, this could not be further from the truth. Without proper oversight, facial recognition technology has the potential to exacerbate existing inequalities and make daily life challenging and dangerous for LGBTQ+ individuals.

Biometric data can provide a uniquely intimate picture of a person’s digital life. Skilled and persistent hackers seeking to exploit access to an individual’s messages on social media, financial records, or location data would view the information collected by facial recognition software as a particularly valuable and worthwhile target, especially as biometric data has become increasingly popular as a form of authentication.

Without proper privacy protections in place, data breaches that target facial recognition data may become far more likely. In the wrong hands, a person’s previously undisclosed sexual orientation or gender identity can become a tool for discrimination, harassment, or harm to their life or livelihood.

The risks to transgender, nonbinary, and gender non-conforming individuals are even more acute. Most facial recognition algorithms are trained on data sets designed to sort individuals into two groups – typically male or female. The extent of the misgendering problem was highlighted in a recent report, which found that over the last three decades of facial recognition research, researchers used a binary construct of gender more than 90 percent of the time and treated gender as a solely physiological construct more than 80 percent of the time.

Consider the challenge – not to mention the emotional toll – for a transgender individual trying to catch a flight who is subjected to routine stops and additional security screening, all because the facial recognition systems expected to be used in all airports by 2020 are not built to reconcile their true gender identity with their government-issued ID.

Members of the LGBTQ+ community cannot shoulder the burden of lax digital privacy standards without also assuming unnecessary risks to their safety online and offline. Our vibrant communities deserve comprehensive, national privacy protections to fully participate in society and live without the fear that their data – biometric or otherwise – will be used to further entrench existing bias and prejudice.

Our communities face the challenge of trying to protect themselves from rules that neither they nor the people implementing them fully understand. Congress must act to ensure that current and future applications of facial recognition are built, deployed, and governed with the necessary protections in mind.

This is why LGBT Tech signed on to a letter led by the ACLU, along with more than 60 other privacy, civil liberties, civil rights, investor, and faith groups, urging Congress to put in place a federal moratorium on face recognition for law enforcement and immigration enforcement purposes until Congress fully debates what, if any, uses should be permitted.

Given the substantial concerns, which representatives on both sides of the aisle recognized at a recent hearing, prompt action is necessary to protect people from harm. We should not move forward with the deployment of this technology until and unless our rights can be fully safeguarded.
