- What is Facial Recognition Technology?
Facial Recognition Technology (FRT) is a flawed but powerful technology that, when used by police, risks not only the misidentification of individuals as suspects for crimes, but also the creation of an enduring chilling effect on individuals’ ability to participate freely in public protest and to move freely in publicly accessible places.
As a probability-based biometric technology, it attempts to identify a person by comparing a biometric template created from a face detected in an image or video, otherwise known as a ‘probe’ image, against a reference database of biometric templates of known or identified people.
Error rates will vary depending on the multiple factors which can affect the performance of an FRT system. These include, but are not limited to:
- The quality of images used, whether that is the ‘probe’ image or the images in the reference database;
- The lighting of the images;
- The pixelation of the images;
- The pose of the person in the ‘probe’ or database images and whether or not parts of a person’s face are clear in the images;
- If parts of the person’s face are obstructed by a mask, scarf, hoodie, sunglasses, etc;
- The selected threshold setting for ‘similarity’; and
- How a person running an FRT search decides to choose a supposed ‘match’.
A threshold value is fixed to determine when the software will indicate that a probable match has occurred. Should this value be fixed too low or too high, respectively, it can create a high false positive rate (i.e. the percentage of incorrect matches identified by the technology) or a high false negative rate (i.e. the percentage of true matches that are not detected by the software). There is no single threshold setting which eliminates all errors.
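The threshold trade-off described above can be sketched in a few lines of Python. This is a purely illustrative toy, not any real FRT system; the similarity scores and ground-truth labels are invented for the sake of the example:

```python
# Toy sketch of how a similarity threshold trades false positives
# against false negatives. All scores below are invented.

def classify(score: float, threshold: float) -> bool:
    """Flag a comparison as a 'match' if its similarity score meets the threshold."""
    return score >= threshold

# Invented comparison results: (similarity score, is this truly the same person?)
comparisons = [
    (0.92, True),   # genuine match with a clear, well-lit image
    (0.71, False),  # a different person who merely looks similar
    (0.55, True),   # genuine match degraded by poor lighting / low quality
    (0.40, False),  # clearly different person
]

def error_rates(threshold: float) -> tuple[int, int]:
    """Count false positives and false negatives at a given threshold."""
    false_pos = sum(1 for s, same in comparisons if classify(s, threshold) and not same)
    false_neg = sum(1 for s, same in comparisons if not classify(s, threshold) and same)
    return false_pos, false_neg

# A low threshold catches every genuine match but also flags the lookalike;
# a high threshold avoids the lookalike but misses the degraded genuine match.
print(error_rates(0.5))  # (1, 0): one false positive, no false negatives
print(error_rates(0.8))  # (0, 1): no false positives, one false negative
```

As the toy numbers show, moving the threshold only shifts errors from one category to the other; no setting eliminates both.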
- What do you mean when you say ‘how a person running an FRT search decides to choose a supposed match’?
An FRT search does not give police a single, positive identification of a person. Instead it gives police a list of potential candidates accompanied by similarity scores. The length of that list largely depends on the size of the database against which an image is compared, and on how many people in the database look like the person being searched for.
Even then, there is no guarantee that a ‘true match’ will be at the top of the FRT search return list. Nor is there a guarantee that a police official will choose the correct ‘true match’ from the list, if one even exists, because the person whose identity is being sought may not actually feature in the database.
For example, Robert Williams was wrongfully arrested in front of his wife and children, detained and arraigned by Detroit police after they used FRT to try to identify a shoplifter who stole a watch. The image of Williams was the ninth most likely match for the probe photograph, which was obtained from CCTV of the incident. The analyst who ran the search did an assessment and decided Mr Williams’ image was the most similar to that of the suspect. Two other algorithms were also run. In one, which returned 243 results, Williams wasn’t even on the candidate list. In the other FRT search – of an F.B.I. image database – the probe photograph generated no results at all.
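The candidate-list mechanics described above can be sketched as follows. The names, scores and database entries are entirely invented for illustration; no real system or data is being described:

```python
# Toy sketch of an FRT search return: the system does not output a single
# identification, only a ranked list of candidates with similarity scores.

def search(probe_scores: dict[str, float], top_n: int = 3) -> list[tuple[str, float]]:
    """Rank database entries by their similarity score to the probe image."""
    return sorted(probe_scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Hypothetical similarity scores between one probe image and a database.
# Note that the true match ('person_c') is NOT at the top of the list:
# a human reviewer still has to choose, and may choose wrongly -- or the
# true person may not be in the database at all.
scores = {
    "person_a": 0.81,
    "person_b": 0.78,
    "person_c": 0.74,  # the actual person in the probe image
    "person_d": 0.52,
}

candidates = search(scores)
print(candidates)  # highest-scoring candidates first; true match ranked third
```

In the Williams case this dynamic played out in reality: the genuine-looking candidate ranked ninth, yet was still selected by the analyst.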
In other words, there are many factors with FRT which risk misidentification.
- Does the misidentification risk with FRT only concern people who are not white and male?
That is not the case. For example, police in Houston, Texas misidentified Harvey Eugene Murphy Jr., a 61-year-old white man, as an armed robbery suspect – even though he was living in California at the time of the incident in Houston. Mr Murphy was held in jail for 10 days after his arrest in October 2023, during which he alleges he was gang-raped by three other men with whom he was detained.
However, it is the case that, while error rates will vary depending on multiple factors, some of which are outlined above, these errors do not affect all individuals equally. Scientific studies have clearly demonstrated deep, inherent racial and gender biases in FRTs, due in part to how they have been trained, meaning women and people of colour are more likely to be misidentified, and therefore wrongly accused by police who use FRT, than light-skinned men. There are six documented cases in which US police misidentified Black people, five men and one woman, due to the use of faulty FRT. The woman, who was eight months pregnant at the time of her arrest and detention, was misidentified as a car-jacking suspect.
Widely reported 2019 testing by the US federal agency the National Institute of Standards and Technology (NIST) found that FRT algorithms were up to 100 times more likely to misidentify Asian and African American people than white men, and that women and younger individuals were also subject to disparately high misidentification rates.
In addition, ensuring more diverse representation in training datasets will not eliminate the problem of demographic disparities in false-match rates. While other factors may also be at play, this is partly because the colour-contrast settings in digital cameras disproportionately produce underexposed images of darker-skinned people, which reduces FRT accuracy when processing and attempting to match those images.
- But seven misidentifications is not a high figure?
This figure does not capture the real-life impact of FRT. There are seven known wrongful arrests in the US, and a wrongful arrest case in Argentina, but it is unknown how many people wrongfully arrested and incarcerated may have taken plea deals in the US specifically. People with previous convictions, in particular, may feel pressure to accept a plea bargain in order to avoid a lengthy sentence for a crime they did not commit. In addition, because of the secrecy around police use of FRT, in the US at least, it is often not disclosed to a person that FRT was used against them.
- But if it’s just an ‘investigatory lead’ what’s the problem?
The use of FRT by police can cause huge disruption and humiliation for innocent people simply going about their lives. People are often wrongfully stopped and questioned on the streets of London by the Metropolitan Police using live FRT because of false matches.
In May of this year, Big Brother Watch launched legal action against police and shop use of FRT in the UK after two incidents: A teenager was wrongly flagged as a suspected shoplifter by FRT cameras while shopping. She was searched, publicly thrown out of the store, told by staff that she was a thief, and banned from shops across the UK which use FRT provided by Facewatch. In a separate case, Shaun Thompson, a community worker, was stopped by police using FRT; wrongly called a criminal, asked to give his fingerprints and held for 20 minutes. He was only let go after he provided a form of identification.
In December 2023, the US Federal Trade Commission banned pharmacy retail chain Rite Aid from using FRT for surveillance purposes for five years after finding that, from 2012 to 2020, Rite Aid used FRT to identify customers suspected of shoplifting or other problematic behaviour, a practice which in fact generated thousands of false-positive ‘matches’ and disproportionately impacted people of colour.
- But didn’t the gardaí say that FRT is 99% accurate?
There are several issues with this figure.
The 99% ‘accuracy’ figure is based on a NIST study which compared high-quality, clean images with perfect lighting, in which people’s faces were perfectly captured for visa applications and mugshots, against very similar clean, high-quality and controlled images. This does not reflect how FRT would be used in Ireland under the present proposal.
Remember, the proposed bill provides for gardaí to use any images or video they legally hold, or can legally access, for an FRT search. This means the use of blurry images taken from poor-quality CCTV, or images where a person’s face is obscured or at an angle, where you may not see a person’s eyes or cheek, or where somebody’s eyes are closed or mouth is open. So the figure presented by An Garda Síochána does not represent the real-life intended use of FRT in Ireland.
These kinds of issues have led police in New York to use lookalike celebrity images to run a search, or to take features such as a set of open eyes, a mouth or a nose from Google Images and superimpose them on a blurry CCTV still in order to run an FRT search in the hope of getting an ‘investigative’ lead.
But in addition, the discriminatory issue persists. Even within that 99% figure quoted by the gardaí, it still shows that error rates are 60 times worse for West African women than for EU men.
- But if FRT is so faulty and error-prone, why is ICCL concerned about mass surveillance?
FRT is deeply problematic both when it fails, as outlined above, and when it functions. The twin dangers of highly consequential misidentifications and pervasive surveillance mean policing bodies should not be deploying FRT at all. However discriminatory and defective FRT may be in a given application, it is a technology which is likely only to become more sophisticated and, in turn, to enable powerful mass surveillance by stripping people of their anonymity, reducing them to walking licence plates and tilting the power dynamic inherent in police-civilian interactions further into the hands of police.
This is a particular risk when FRT is used on live or recorded video which threatens to allow police to efficiently track one or many individuals across multiple video feeds, or to pull up every instance of one or more persons appearing in video recordings over time. This capability, which has already been used to devastating effect by some authoritarian governments, threatens to chill people’s fundamental rights to freedom of expression and protest. Members of the public, aware they are being watched, might alter their behaviour and self-censor. Such surveillance infringes on people’s fundamental right to privacy.
This technology threatens to give a government the unprecedented ability to instantaneously identify and track anyone as they go about their daily lives; such invasive tracking could easily reveal sensitive details about an individual’s political opinions, religious or philosophical beliefs, sex life or sexual orientation. It is for these reasons that courts in recent years have called policing FRT “novel and untested”, “highly intrusive”, and “controversial”.
It is because of these dangers and risks associated with police use of FRT that more than 20 jurisdictions in the US have banned the use of FRT by police, including Boston; Minneapolis; Pittsburgh; Jackson, Mississippi; San Francisco; King County, Washington; and the State of Vermont.
- What does the Irish FRT legislation propose?
The Draft General Scheme of the Garda Síochána (Recording Devices) (Amendment) Bill provides for An Garda Síochána to apply FRT to any images or footage that the gardaí legally retains, or can legally access, in order to “identify”, “locate” or “follow the movements of a person” as a means to “progress an investigation” pertaining to certain offences, including some public order offences, or “a matter relating to the protection of the security of the State”.
There is a stark lack of safeguards and limitations on the use of FRT within the draft scheme, while there is no explanation as to the source of “biometric data which is legally held by An Garda Síochána” against which FRT searches would be run. The scheme essentially provides for gardaí to press “rewind” on a person’s movements without any requirement that there is an evidentiary link that the person being sought, identified and tracked has committed, or is even suspected of having committed, a crime. Crucially, it is proposed that such intrusive searches will be subject to internal Garda approval as opposed to judicial approval or approval from an independent authority.
It is ICCL’s position that this scheme, with its very broad purpose to “progress an investigation”, gives excessive discretion to gardaí to identify and track the movements of people without limitation; in an untargeted fashion; without safeguards; without due consideration for whether such identification, locating or tracking would take place at a protest or place of worship where other special category data could be processed; and without any requirement for objective and verifiable evidence that a person searched, or a person in a database searched against, has any link to the respective offence. The scheme also contains no definition of national security.
- When you say the scheme says gardaí can use any imagery or video it legally retains, or can legally access, what do you mean? And what biometric database would the gardaí use to run FRT searches against?
The short answer is that we don’t know. The stipulation that gardaí can use any imagery or video material that they legally retain, or can legally access, gives the gardaí enormous discretion. There is an important backdrop to this proposal: the Garda Síochána Recording Devices Act 2023 has already vastly expanded the ability of gardaí to record people through the use of body-worn cameras, drones, and access to CCTV.
The biometric database the gardaí would use to run searches against is also utterly unclear. It is not known what database they intend to use; the source of the database; how a database would be populated if they were to build their own; or the criteria for adding anyone to that database. This is hugely important because anyone in that database would be placed in a virtual line-up, potentially without just cause, every time a search is carried out, putting them at risk of misidentification. The important backdrop to this is that, since 2013, the State has unlawfully, in ICCL’s view, built a national biometric database of 3.2 million cardholders’ unique facial features.
- Does the scheme provide for live use of FRT?
No. Head 4 of the draft scheme prohibits the use of live FRT. But the scheme fails to state how long after material is recorded it could be subjected to a retrospective FRT search. Without a defined time lag, a ban on live FRT does not mean much, as the gap could be any length of time, however short.
- But surely a ban on live FRT is a good thing. Isn’t retrospective FRT ok?
While it is welcome that the Minister has done a U-turn and abandoned her initial plan to introduce both live and retrospective FRT, the use of FRT in both scenarios represents a major interference with people’s rights. The risks of persistent tracking, and its adverse impact on rights and democracy, associated with retrospective FRT are “at least equivalent” to those of live FRT: the amount of imagery potentially available for retrospective attempts to identify, locate and track a person is always greater than the imagery available at a single point in time for live attempts to do the same. As such, retrospective use of FRT makes it possible to draw a much more complete picture of the activities of any individual, representing a major interference with a person’s fundamental rights. Experts have warned that the use of retrospective FRT “marks a step change in police surveillance capability that may fundamentally alter the balance of power between the state and its citizens”.
- Where do things currently stand with the Irish FRT bill?
We understand that the bill is currently being drafted, and we do not know when it will be published. In February, the Oireachtas Justice Committee highlighted serious deficiencies in the bill and issued a list of issues that the Minister and her department must urgently clarify.
They highlighted many issues, including but not limited to:
- The need for a “rationale” for introducing FRT in Irish policing to be published;
- The lack of clarity on the part of An Garda Síochána about how they intend to use FRT;
- The need for An Garda Síochána and the Department of Justice to urgently clarify which facial image reference databases they intend to compare images against;
- The need for the Minister for Justice to address FRT accuracy issues;
- The need for the Minister for Justice to address FRT discrimination and inherent bias concerns;
- The need to bring the legislation in line with EU law;
- The imprecision and lack of clarity in the scheme as to when FRT would be used; and
- The lack of clarity about the source(s) of imagery An Garda Síochána intends to use.
- So what now?
If you’re as concerned about FRT as we are, share this page with your friends and family; talk to each other about the dangers of this technology; tell your public representatives that you oppose the introduction of this faulty, discriminatory and dangerous technology to Irish policing; and join our campaign by signing our petition.