U.K. Watchdog Calls Facial Recognition Technology 'Inaccurate'
Marie Donlon | May 15, 2018
Source: BBC
Though facial recognition technology is gaining worldwide usage, its accuracy is being challenged by U.K. privacy watchdog Big Brother Watch, according to recent reports.
After submitting freedom of information requests to each police force in the U.K., Big Brother Watch determined that some forces have incorrectly identified a number of suspects. One force reported inaccurately matching more than 100 possible suspects against its mugshot database, while another made 2,685 matches over one year, 2,451 of which were eventually declared incorrect.
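For context, a minimal back-of-the-envelope calculation, assuming only the two reported figures above (2,685 alerts, 2,451 later declared incorrect), shows the false match rate those numbers imply:

```python
# Reported figures from the Big Brother Watch freedom of information responses
total_alerts = 2685      # matches generated over one year
incorrect_alerts = 2451  # matches later declared incorrect

false_match_rate = incorrect_alerts / total_alerts
print(f"Implied false match rate: {false_match_rate:.1%}")  # roughly 91.3%
```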
Deployed at a number of public events such as carnivals, festivals and parades, high-definition cameras match faces in the crowd against images, such as mugshots, held in a police database.
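Systems of this kind typically work by converting each detected face into a numerical "embedding" and comparing it against embeddings of watch-list images, raising an alert when the similarity clears a threshold. The sketch below is purely illustrative, using hypothetical random embeddings and a hypothetical threshold; it is not the software used by any U.K. force.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 128-dimensional embeddings; a real system would derive these
# from a face-recognition model applied to camera frames and mugshots.
rng = np.random.default_rng(0)
watchlist = {"suspect_A": rng.normal(size=128), "suspect_B": rng.normal(size=128)}
detected_face = rng.normal(size=128)

ALERT_THRESHOLD = 0.6  # hypothetical; operational thresholds vary widely

for name, reference in watchlist.items():
    score = cosine_similarity(detected_face, reference)
    if score >= ALERT_THRESHOLD:
        print(f"Alert: possible match with {name} (score {score:.2f})")
    else:
        print(f"No alert for {name} (score {score:.2f})")
```

Where that threshold is set largely determines the trade-off: a lower threshold catches more genuine matches but generates more of the false alerts Big Brother Watch highlights.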
Responding to reports that the technology is not entirely accurate, the Metropolitan Police said it was trialing facial recognition technology to determine whether it is reliable enough to "assist police in identifying known offenders in large events, in order to protect the wider public."
"Regarding 'false' positive matches — we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts," Metropolitan Police said in a statement. "All alerts against the watch list are deleted after 30 days. Faces in the video stream that do not generate an alert are deleted immediately."
Still, Big Brother Watch is apprehensive about the use of the technology and believes it might impact "individuals' right to a private life and freedom of expression."
"Automated facial recognition technology is currently used by U.K. police forces without a clear legal basis, oversight or governmental strategy," the group said.