Police use of facial recognition tools destroys privacy
By AdvocateDaily.com Staff
Presser, principal of Presser Barristers, writes in the online legal publication that the first concern is that such tools are not very accurate.
She says the most precise facial recognition tool in 2017 — Chinese technology called Tencent YouTu Lab — accurately identified faces in a testing challenge only 83.29 per cent of the time. That means that, across a large sample, even the most accurate facial recognition tool would be wrong about one in six times.
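As a quick sanity check on the one-in-six figure (a back-of-the-envelope sketch, not part of Presser's piece, and assuming the reported 83.29 per cent accuracy applies uniformly across a sample):

```python
# Reported accuracy of the best-performing 2017 tool in the testing challenge.
accuracy = 0.8329

# The error rate is simply everything the tool gets wrong.
error_rate = 1 - accuracy        # about 0.167, i.e. roughly 16.7 per cent

# Invert the error rate to express it as "wrong one time in N".
wrong_one_in = 1 / error_rate    # about 5.98, so roughly one in six

print(f"Error rate: {error_rate:.2%}")
print(f"Wrong about one in {round(wrong_one_in)} identifications")
```

An error rate of roughly 16.7 per cent works out to about one misidentification in every six attempts, which matches the "one in six" characterization in the article.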
“Unlike identifications based on DNA, which sits on a scientifically tested and validated statistical platform, we do not know the variation across the human population of various facial characteristics,” Presser writes in The Lawyer’s Daily opinion piece.
“Making identifications based on facial characteristics has not been proven using scientific methods. Facial recognition is more art than science.”
This lack of accuracy creates a real risk of wrongful arrests, wrongful detentions and even wrongful convictions, she says, noting that the technology has been used by Toronto Police for more than a year.
“Given some of the privacy-destruction and other concerns arising from this technology, the answer is not to improve its accuracy or get better science about the incidence of facial characteristics,” Presser says.
“As privacy scholar professor Woodrow Hartzog has noted, facial recognition is harmful when it is inaccurate, and incredibly oppressive the more accurate it gets. The answer is to ban the use of this technology in law enforcement.”
She says another concern with facial recognition tools is bias.
The darker one’s skin, the more likely one is to be misidentified by facial recognition tools, she tells The Lawyer’s Daily.
“This means that use by law enforcement of facial recognition tools is likely to result in reinforcing over-representation of non-white individuals, incorrectly, in the justice system,” Presser says.
She says equally concerning is the fact that the use of such tools in policing depends on the state maintaining massive databases of biometric data about private citizens, including images of their faces, connected to names and identities.
Presser says the state should not be maintaining such databases linked to identities — at least not for law enforcement purposes.
She says the Supreme Court has recently recognized that people can have a reasonable expectation of privacy in their own image.
“The state should not be able to collect and store our faces to potentially then use them against us,” Presser says.
The fourth and perhaps most troubling problem surrounding the use of facial recognition tools by law enforcement is that they facilitate the constant surveillance of private citizens by the state, she says.
“Facial recognition implies an Orwellian level of surveillance. This would be destructive of any kind of concept of privacy as anonymity in public places,” Presser says.
Anonymity is recognized by the Supreme Court as attracting constitutional protection under s. 8 of the Charter, she says.
Anonymity is an extremely important kind of privacy in a democracy, Presser says.
In a 2019 New York Times opinion piece, professors Hartzog and Evan Selinger refer to this kind of privacy as obscurity, the privacy people enjoy in public places, and explain that it is essential for freedom and democracy to flourish, she adds.
Public anonymity is a precursor to freedoms and rights, Presser says. We can only truly exercise those rights — freedom of association, freedom of expression, freedom of religion — when we are free from surveillance, oppression or persecution, she says.
“Facial recognition tools in the hands of law enforcement threaten to destroy the anonymity or obscurity that is essential to the enjoyment of fundamental rights and freedoms. They threaten the very fabric of civil society and democracy itself,” Presser writes in The Lawyer’s Daily. “We must reclaim our right to be just another face in the crowd.”