It’s everywhere: at restaurants, airports, concert venues, even beer festivals, and soon it will be at the 2020 Olympics.

Facial recognition has seemingly taken over as the technology du jour.

Yet for something so pervasive, the technology has been criticized for a variety of flaws, and those criticisms are forcing some of its biggest customers — government and law enforcement — to reconsider its use.

Entrusted to unlock phones, grant access to buildings and locate wanted suspects, the technology needs to be dependable, yet it reportedly is not. While many government agencies and companies have been quick to adopt it despite its imperfections, facial recognition has faced persistent criticism for being inaccurate and biased. In particular, the technology is widely considered less reliable on people of color because the software is typically trained on datasets dominated by the faces of white men.
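
One way such bias is measured in practice is with a disaggregated audit: error rates are computed separately for each demographic group rather than as a single overall accuracy figure. The Python sketch below is purely illustrative; it uses a simulated matcher (no real model or dataset), with noisier scores standing in for a group that was underrepresented in training.

```python
import random

# Purely illustrative simulation of a disaggregated audit: error rates
# are measured per group instead of as one overall accuracy number.
# No real face-recognition model or dataset is involved.

random.seed(0)

def simulated_match_score(group: str, same_person: bool) -> float:
    # Stand-in for a real matcher. The underrepresented group gets
    # noisier scores, modeling a system trained mostly on one
    # demographic; the exact numbers are arbitrary.
    noise = 0.10 if group == "well_represented" else 0.25
    base = 0.9 if same_person else 0.3
    return base + random.gauss(0, noise)

THRESHOLD = 0.6  # scores at or above this count as a "match"

for group in ("well_represented", "underrepresented"):
    # Compare 10,000 pairs of different people; any match is false.
    impostor_scores = [
        simulated_match_score(group, same_person=False)
        for _ in range(10_000)
    ]
    false_matches = sum(s >= THRESHOLD for s in impostor_scores)
    print(f"{group}: false-match rate = {false_matches / 10_000:.2%}")
```

With these arbitrary parameters, the noisier group's false-match rate comes out far higher, the same pattern that independent audits of commercial systems have reported.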

The good

Facial recognition technology has been credited with capturing wanted criminals and stalkers at Taylor Swift concerts. Likewise, it is being implemented in U.K. and Chinese prisons to help combat drug smuggling and gang violence as well as to thwart escape attempts. It has even helped to identify and subsequently nab imposters attempting to board planes using another person’s passport.

The bad

However, examples of the inaccuracies and biases inherent in facial recognition technology abound. One notable example is Amazon’s Rekognition software, which reportedly misidentifies the faces of people of color at disproportionately high rates.

The ACLU. To demonstrate the flaws inherent in facial recognition technology, the American Civil Liberties Union (ACLU) compared images of U.S. Congress members against a database of public mug shots using Amazon’s Rekognition software. According to the ACLU, 28 members of Congress were falsely matched with criminal suspects, a finding that Amazon disputed at the time. The ACLU maintained that the results highlighted the failings of facial recognition technology.
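
For a sense of the methodology, the sketch below shows how such a test could be run with the AWS SDK for Python (boto3). The collection name and image folder are hypothetical, and a Rekognition collection is assumed to have been populated with the mugshot photos beforehand. The ACLU reportedly ran its test at Rekognition's default 80% similarity threshold, a setting Amazon argued was too low for law enforcement use.

```python
import boto3
from pathlib import Path

# Assumptions: a Rekognition collection named "mugshot-collection" has
# already been populated with mugshot photos, and "congress_portraits/"
# is a local folder of official member photos. Both names are made up.
client = boto3.client("rekognition")

suspect_matches = []
for portrait in sorted(Path("congress_portraits").glob("*.jpg")):
    response = client.search_faces_by_image(
        CollectionId="mugshot-collection",
        Image={"Bytes": portrait.read_bytes()},
        # The ACLU reportedly used Rekognition's default 80% similarity
        # threshold; Amazon argued law enforcement should require 99%.
        FaceMatchThreshold=80,
        MaxFaces=1,
    )
    if response["FaceMatches"]:
        similarity = response["FaceMatches"][0]["Similarity"]
        suspect_matches.append((portrait.name, similarity))

# No member of Congress actually appears in the mugshot set, so every
# hit recorded here is a false positive.
for name, similarity in suspect_matches:
    print(f"{name} matched a mugshot at {similarity:.1f}% similarity")
```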

"Our test reinforces that face surveillance is not safe for government use," said Jacob Snow, the ACLU's technology and civil liberties lawyer.

"Face surveillance will be used to power discriminatory surveillance and policing that targets communities of colour, immigrants and activists. Once unleashed, that damage can't be undone," continued Snow.

Particularly concerning to the ACLU was the makeup of those misidentified: nearly 40% of the false matches were of people of color, even though they make up only about 20% of Congress. The findings prompted the Congressional Black Caucus to warn of the "profound negative unintended consequences" facial recognition systems could have for African Americans.

"Congress should press for a federal moratorium on the use of face surveillance until its harms, particularly to vulnerable communities, are fully considered," said the ACLU's legislative counsel Neema Singh Guliani.

Jaywalking failure. During a trial in Ningbo, China, of artificial intelligence (AI) and facial recognition technology designed to publicly shame jaywalkers, the system misidentified a woman pictured in an advertisement on the side of a city bus as a jaywalker.

Failure at the MTA. A 2018 trial of facial recognition technology conducted by the Metropolitan Transportation Authority (MTA) in New York City failed, according to a report from the Wall Street Journal. Conducted at the Robert F. Kennedy Bridge (formerly the Triborough Bridge), which links Manhattan, Queens and the Bronx, the trial did not positively identify a single driver, according to an MTA email cited in the report.

Experts suggested that the trial failed because of the speed at which drivers cross the bridge, which makes identification nearly impossible.

A 3D-printed head. Another demonstration set out to show that facial recognition, when used to secure devices such as phones and tablets, can be defeated. A Forbes reporter had a 3D model of his head printed and then used it to successfully unlock four different Android devices secured with built-in facial recognition.

U.K. watchdog. U.K. privacy watchdog Big Brother Watch called the accuracy of facial recognition technology into question after discovering that some U.K. police departments using it had misidentified a number of criminal suspects. By the group's estimate, the systems' matches were wrong roughly nine times out of 10.
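
A false-match rate that high is less surprising than it sounds. When a system scans large crowds in which genuinely wanted people are rare, even a small per-face error rate produces far more false alerts than true ones. The back-of-the-envelope Python sketch below illustrates the effect; every number in it is hypothetical rather than a figure from the U.K. deployments.

```python
# Hypothetical numbers throughout: the point is the base-rate effect,
# not the specifics of any real deployment.

faces_scanned = 100_000       # faces a camera sees over a deployment
watchlist_prevalence = 1e-4   # fraction of passers-by genuinely wanted
true_positive_rate = 0.90     # chance a wanted face is correctly flagged
false_positive_rate = 1e-3    # chance an innocent face is wrongly flagged

wanted = faces_scanned * watchlist_prevalence     # 10 people
innocent = faces_scanned - wanted                 # 99,990 people

true_alerts = wanted * true_positive_rate         # ~9 correct alerts
false_alerts = innocent * false_positive_rate     # ~100 false alerts

share_false = false_alerts / (true_alerts + false_alerts)
print(f"Share of alerts that are false: {share_false:.0%}")  # ~92%
```

Even with an optimistic 0.1% per-face false-positive rate, roughly nine out of 10 alerts in this toy scenario point at the wrong person.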

The ugly

Although a bipartisan bill seeking limits on facial recognition technology is currently before Congress, and pushback from privacy advocates has heightened attention to the issue, the technology remains almost entirely unregulated, prompting criticism from a number of sources.

The letter. Recently, 90 advocacy and activist groups wrote open letters to tech giants Google, Amazon and Microsoft, imploring them not to sell their facial recognition technologies to government authorities. The letters, which were signed by a coalition that includes the American Civil Liberties Union, the Electronic Frontier Foundation, Human Rights Watch and the Refugee and Immigrant Center for Education and Legal Services, warn that facial recognition technology enables government and law enforcement to unfairly target specific groups, including immigrants, people of color and religious minorities.

Nicole Ozer, technology and civil liberties director for the ACLU of California, explained: “History has clearly taught us that the government will exploit technologies like face surveillance to target communities of color, religious minorities, and immigrants. We are at a crossroads with face surveillance, and the choices made by these companies now will determine whether the next generation will have to fear being tracked by the government for attending a protest, going to their place of worship, or simply living their lives.”

A possible ban. Similarly, a lawmaker in San Francisco has presented legislation that would make the city the first in the U.S. to ban the use of facial recognition technology.

As part of an initiative called the Stop Secret Surveillance Ordinance, Aaron Peskin, a member of San Francisco’s Board of Supervisors, proposed an outright ban on city agencies, including law enforcement, using facial recognition technology. The ban is just one part of an overall proposal that would require the city’s agencies to seek the board’s permission before purchasing and using surveillance technology of any kind. The goal of the initiative, according to Peskin, is to bring greater oversight to surveillance tech use in the city.

“I have yet to be persuaded that there is any beneficial use of this technology that outweighs the potential for government actors to use it for coercive and oppressive ends,” Peskin said.

With many continuing to use the technology undeterred, examples of potential abuse are sure to follow, as evidenced in Brooklyn, New York, where a landlord is preparing to install facial recognition technology at the entrance of a 700-unit, rent-stabilized apartment complex. Tenants and housing rights attorneys have called the move far-reaching and egregious, arguing that the technology not only invades privacy but also violates civil liberties. The complex’s residents, largely elderly African American women, are likely to be most affected; in a letter, residents wrote that the technology “disproportionately impacts the elderly, women and people of color” and noted that the landlord had “made no assurances to protect the data from being accessed by NYPD, ICE, or any other city, state, or federal agency.”

Similarly, homeowners may soon be able to use the technology to surveil anyone who approaches their front door: patent filings suggest that Ring’s line of video doorbells could use facial recognition to flag people deemed suspicious in a neighborhood and alert local law enforcement to their presence.

Such uses make regulation and oversight of facial recognition technology all the more urgent.

To contact the author of this article, email mdonlon@globalspec.com