Leading experts warn against Garda use of FRT

Use of the “toxic” tool would result in a “massive step change” in police surveillance in Ireland

Speaking at the Policing Surveillance conference in Dublin, hosted by the Irish Council for Civil Liberties and the Committee on the Administration of Justice in Northern Ireland, Dr Abeba Birhane, Dr Daragh Murray and Dr Elizabeth Farries sounded the alarm on the technology’s inherent flaws and racial bias, and warned that the negative impacts of the surveillance tool on Irish society may not be fully realised until it is too late. 

During the discussion, chaired by Dr Farries, co-director of UCD’s Centre for Digital Policy, Dr Birhane showcased several crucial pieces of research. Dr Birhane, a senior advisor in AI accountability with the Mozilla Foundation and Adjunct Assistant Professor at Trinity College Dublin, is an expert in analysing datasets and unveiling the racial and gender stereotypes and biases within Artificial Intelligence systems. 

The studies demonstrated the depth of the biases and flaws embedded in these systems; the key findings are summarised in the research slide reproduced below. 

Dr Birhane also highlighted how these technological biases and flaws have significant real-life downstream effects, with people in the US being misidentified and subsequently wrongfully arrested; all of those known to date are Black people. Importantly, it is unknown how many misidentified innocent people in the US, or elsewhere, may have taken a plea deal, pleading guilty in order to avail of a lighter sentence. An emerging issue in the US is the lack of transparency around the use of the technology in law enforcement: the accused are not informed that FRT was used, and are thereby deprived of the opportunity to challenge its use. 

Research by Dr Abeba Birhane shared at the ICCL/CAJ Policing Surveillance conference

Dr Farries noted: “The impacts of inaccuracies and bias harm all of us through the results of an entrenched surveillance society but they attack, in particular, those who are already dealing with inequalities and discrimination.” 

Dr Murray, a senior lecturer at Queen Mary University of London’s School of Law and a fellow of the Institute of Humanities and Social Sciences, carried out the only independent review of the Metropolitan Police’s use of live FRT in London. He found it was accurate only 19% of the time. 

Speaking about the use of FRT in the UK, Dr Murray assessed the landmark Ed Bridges v South Wales Police case, in which the UK Court of Appeal found the use of live FRT by the police was unlawful. He also explained that: 

  • FRT can extinguish the concept of anonymity by allowing police forces to monitor not only what happens in an area in real time [live FRT], but also what happened back in time [retrospective FRT]; 
  • A legal challenge against the use of FRT in the UK is likely in the near future due to the very shaky legal ground upon which it is currently being deployed; and 
  • People should be concerned that some police units in the UK have expressed a desire to use FRT to monitor, track and profile people, in a manner similar to how FRT is used against the Uighur population in China. 

Dr Murray warned that while live FRT – where an FRT algorithm is applied to a surveillance camera feed in real-time – may be deemed more controversial by some, retrospective FRT – where an FRT algorithm is applied to pre-recorded material or a database of images – is potentially more problematic for fundamental rights. 
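
To make Dr Murray's distinction concrete, here is a minimal conceptual sketch in Python of the two modes of deployment. It is purely illustrative: match_faces, watchlist, camera_feed and archive are hypothetical placeholders, not any real police or vendor system.

```python
# Conceptual sketch only. It illustrates the live vs retrospective
# distinction described above; match_faces, watchlist, camera_feed
# and archive are hypothetical placeholders, not a real API.

def live_frt(camera_feed, watchlist, match_faces):
    """Live FRT: the algorithm runs on each frame as it arrives."""
    for frame in camera_feed:                 # real-time stream
        for hit in match_faces(frame, watchlist):
            yield hit                         # can trigger an on-the-spot stop

def retrospective_frt(archive, watchlist, match_faces):
    """Retrospective FRT: the same algorithm runs over stored footage."""
    hits = []
    for frame in archive:                     # pre-recorded material or image database
        hits.extend(match_faces(frame, watchlist))
    return hits                               # reconstructs a person's past movements
```

As the sketch suggests, the only difference is where the frames come from: once footage is stored, the same matching step can be replayed over any period of the past, which is the capability Dr Murray describes below.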

He said: “FRT, as a tool, enables a real step change in police surveillance capability. It’s something that was almost restricted to an intelligence context but is now coming into day-to-day policing.”  

“The application of FRT removes the possibility of anonymity within a city. It allows police forces to look at what happens in a city, not only at a moment, but back in time. That’s an incredibly powerful thing and I don’t think we fully understand the consequences of that.” 

“Essentially all retrospective FRT is the application of an algorithm through recorded information. If you have a citywide surveillance camera network that’s stored to a database, as soon as you record it to the database, you’re talking about retrospective, so you can apply the algorithm to it. Not immediately, not in real-time, but in very, very close-to-real-time and, in a way, that’s still capable of influencing real-time events. It’s those kinds of applications that we haven’t thought through.”  

“We can understand to a degree the impact on the right to privacy, we understand when it comes into conflict with the prohibition of discrimination. But what we don’t understand is the much broader, societal impact of these types of technologies.” 

“The example I always think of is the GDR [German Democratic Republic] and the extent of police surveillance there. What you could reveal about a population with huge, huge resources is tiny in comparison to what you can achieve with facial recognition and other forms of surveillance technology.”  

“It’s that really massive step change in state or police surveillance capability and how that affects how people interact… I don’t think we have any idea. There’s initial research on the chilling effects that shows it’s a problem but it’s very difficult to quantify and know where it’s going. We can see discrimination in the short-term and people being arrested. But we can’t see the chilling effect, it’s like the frog in the kettle, until it’s too late, and that’s a really big, big concern.” 

Panel speakers L-R: Dr Elizabeth Farries, Dr Daragh Murray and Dr Abeba Birhane

Last year, a number of researchers audited the use of live, retrospective and mobile phone FRT by police in England and Wales and found that all three deployments failed to meet the minimum ethical and legal standards for the governance of FRT. 

Earlier this year, Dr Birhane, Dr Farries, ICCL, Digital Rights Ireland and other academics in Ireland considered that audit and the safeguards its authors recommended be strictly put in place before any deployment of FRT. Led by Dr Birhane, they adopted those safeguards as assessment requirements and sent them to the Data Protection Commission in respect of the proposed plans to enable Garda use of FRT. 

At the conference, Dr Birhane detailed some of those requirements, which she said An Garda Síochána would have to answer before any deployment of FRT. 

They included: 

  • Are there clear, objective, and limited criteria concerning third-party access to the data collected or retained, including with regard to what data can be shared, with whom it can be shared, and for what specific purpose it can be shared? 
  • Has An Garda Síochána identified less intrusive alternative measures and proven that FRT is strictly necessary compared to these measures using scientifically verifiable evidence? 
  • Has An Garda Síochána pre-established minimum thresholds to be met for the FRT system’s accuracy (precision, false positive rate, true positive rate) to inform the legal test of strict necessity for personal data processing? (These metrics are illustrated in the sketch after this list.) 
  • Will all FRT materials be accessible to third-party independent auditors? 
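
As a rough illustration of the metrics named in the third requirement, the sketch below shows how precision, true positive rate and false positive rate are conventionally computed from matching outcomes. It is purely illustrative: the function and all of the counts are hypothetical and do not describe any Garda or vendor system.

```python
# Illustrative only: standard detection metrics from hypothetical counts.
# tp/fp/fn/tn = true/false positives and negatives from an FRT deployment.

def frt_metrics(tp: int, fp: int, fn: int, tn: int):
    """Compute the accuracy metrics named in the assessment requirements."""
    precision = tp / (tp + fp)  # of all alerts raised, how many were correct
    tpr = tp / (tp + fn)        # of all watchlisted people seen, how many were flagged
    fpr = fp / (fp + tn)        # of all other passers-by, how many were wrongly flagged
    return precision, tpr, fpr

# Made-up example: a tiny false positive *rate* can still mean dozens of
# innocent people wrongly flagged once enough faces pass the cameras.
precision, tpr, fpr = frt_metrics(tp=8, fp=42, fn=2, tn=99_948)
print(f"precision={precision:.2f}, TPR={tpr:.2f}, FPR={fpr:.5f}")
# precision=0.16, TPR=0.80, FPR=0.00042
```

Pre-setting minimum thresholds for figures like these, before any deployment, is what the requirement asks An Garda Síochána to do.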

Dr Birhane warned: “If the gardaí, or people contracted by the gardaí, say there is no bias in FRT, that raises a major suspicion. Any assessment has to be done by independent auditors who have no skin in the game in order for the results to be credible.” 

She added: “I consider myself an auditor, so I can vouch for the really critical importance of independent auditors and evaluators, who need access to the datasets, to the models, and to all the information that’s available. Will the gardaí commit to that access for independent auditors?” 

“This technology is relatively new. And as technology is moving really, really fast, there are new models, new datasets almost every other week. So even for a researcher, who’s in the midst of it, it’s really difficult to keep up with the latest state-of-the-art models. So given how these technologies advance, will there be training for the gardaí themselves in how the technology works and its potential for failure? And will there be, before the actual technology is deployed, some controlled trial to test, to see how it performs?” 

But she warned: “My own personal view is that it’s better not to implement the technology at all… because the technology is inherently flawed and extremely invasive. There is no way to develop a technology that has zero percentage of bias. It’s impossible. It’s technically impossible… what we are dealing with is a really toxic technology that should be discarded.” 

The academics’ comments come as the Irish government prepares to publish a new bill, the Garda Síochána (Digital Management and Facial Recognition Technology) Bill 2023, to allow gardaí to use FRT. 

The Department of Justice had initially planned to introduce this mass surveillance power by way of a last-minute amendment to the Garda Síochána (Recording Devices) Bill this year, without any public consultation, pre-legislative scrutiny or democratic debate. That bill vastly expands Garda use of devices capable of recording people and their movements in public spaces, including drones, body-worn cameras, animal-worn cameras and Automatic Number Plate Recognition (ANPR), and gives gardaí live access to third-party CCTV, greatly expanding their access to and collection of recorded material. 

Fine Gael climbed down from this position following pressure from experts at seven universities and 13 NGOs, including ICCL and Digital Rights Ireland, as well as leading academics and the Green Party, and instead plans to introduce a standalone piece of legislation, the Garda Síochána (Digital Management and Facial Recognition Technology) Bill 2023. 

The Minister for Justice first announced plans to introduce both live and retrospective FRT in Irish policing in May 2022 but has also climbed down from this position. The Minister has since said the FRT bill will allow for retrospective FRT alone, whereby gardaí could apply an FRT algorithm to images already legally in the possession of An Garda Síochána. In September 2022, four UN Special Rapporteurs urged the Irish Government to halt its FRT plans. 

ICCL strongly opposes the use of live or retrospective FRT by An Garda Síochána and in 2021 called for a ban on police use of FRT. 

You can listen back to the facial recognition technology panel discussion from the Policing Surveillance conference below: