Submission to the Irish Government on AI Act Implementation

17 July 2024

ICCL Enforce responded to the Irish government's consultation on the national implementation of the EU AI Act. By 2 August 2025, the Irish government must appoint the regulators responsible for enforcing the AI Act. ICCL Enforce has suggested an enforcement structure and emphasised the need for adequate resources so that the regulators can enforce the law effectively.


Department of Enterprise, Trade and Employment  
23 Kildare Street,  
Dublin 2

By email

16 July 2024

 

Submission to the Irish Government on AI Act Implementation 

Dear Colleagues,

  1. The Irish Council for Civil Liberties (ICCL) is Ireland’s oldest independent human rights organisation. We welcome the opportunity to provide input[1] on Ireland’s implementation of the EU AI Act.[2]

  2. We make recommendations on two topics:

a) AI Act national enforcement structure
b) Adequate resources for the regulators

AI Act national enforcement structure

  1. Ireland is losing credibility as the EU’s major tech regulator due to its lethargic enforcement of the GDPR.[3] The AI Act offers Ireland an opportunity to change that and open a new chapter of robust enforcement. This is possible through a part-distributed, part-centralised enforcement structure.

  2. We suggest that, for the products in Annex I of the AI Act, the existing market surveillance authorities (MSAs) for those products be designated as the MSAs[4] under the AI Act.

  3. For prohibited AI systems[5] and high-risk AI systems in Annex III, we suggest that a central and coordinating supervisory authority, such as the Data Protection Commission (DPC), be designated as the MSA under the AI Act. However, there are at least two exceptions to this designation:

    a) high-risk applications in sectors where an MSA exists,[6] in which case the existing MSA should be designated; and

    b) high-risk AI systems used in financial services,[7] including for the assessment of creditworthiness,[8] in which case the Central Bank of Ireland or the Financial Services and Pensions Ombudsman could be the MSA.

  4. The AI Act gives powers to fundamental rights bodies.[9] The Irish government must identify those bodies and notify the list to the European Commission and other EU countries by 2 November 2024. We suggest that, at a minimum, the Irish Human Rights and Equality Commission (IHREC) and the DPC be identified as authorities protecting fundamental rights under the AI Act. Fundamental rights bodies will play a critical role in evaluating the fundamental rights risks of AI systems. These bodies can identify risks in deployed AI systems, including AI systems that were initially deemed to be compliant,[10] and can require corrective action from the companies.

  5. A clear coordination structure among the MSAs, sector-specific regulators (such as in the employment sector) and the fundamental rights bodies is essential for effective enforcement. The MSAs should closely collaborate with sector-specific regulators and fundamental rights bodies during investigations, and establish a knowledge-sharing system. The exchange of information between MSAs, sector-specific regulators and fundamental rights bodies should be made possible by creating legal bases in the national law implementing the AI Act.

  6. Furthermore, we recommend that Ireland establish an advisory forum[11] consisting of civil society organisations with fundamental rights expertise and trade unions, as well as people who are often at the receiving end of AI deployment, such as teachers, artists and actors. They have important insights into AI harms and incidents. By regularly engaging with the advisory forum, MSAs would be exposed to perspectives that can inform their work.

Adequate resources for the regulators

  1. The culture at the MSAs will be critical for effective enforcement of the AI Act. That culture should be investigative and sceptical. AI systems impact the lives of people in Ireland and Irish society. A soft-touch regulator will fail both to promote responsible use of AI and to protect people from the harms of AI.

  2. MSAs cannot fulfil their mandate of enforcing the law unless they have adequate budget, capacity and skilled staff.

  3. Assessing AI systems on the market and their compliance with the AI Act will require in-depth technical knowledge. All MSAs, sector-specific regulators and fundamental rights bodies must have technical expertise at their disposal.

  4. We recommend that, at a minimum, the central and coordinating supervisory authority be a hub of expertise, with an adequate number of technical experts employed to support the central authority, the other MSAs, sector-specific regulators and the fundamental rights bodies.

  5. MSAs will also have to learn from fundamental rights bodies and upskill to assess the fundamental rights impacts of AI systems. The knowledge-sharing system can assist with this.

  6. We also recommend that Ireland establish a pool of independent technical, legal and fundamental rights experts on AI, whose expertise can be drawn on when the MSAs and fundamental rights bodies require specialised assistance. These experts should be chosen transparently and scrupulously, with no member having an actual or perceived conflict of interest with the companies regulated within the scope of the AI Act.

  7. Finally, Ireland must establish at least one operational AI regulatory sandbox by 2 August 2026. We recommend setting up a separate unit at the MSAs, with dedicated resources and staff, to supervise sandboxes. Resources and staff should not be diverted from the enforcement unit to supervise sandboxes.

 

Sincerely,

Dr Kris Shrishak
ICCL Enforce Senior Fellow

 

Notes

 

[1] Public consultation on National Implementation of EU Harmonised Rules on Artificial Intelligence (AI Act), 21 May 2024. URL: https://enterprise.gov.ie/en/consultations/public-consultation-on-eu-ai-act.html

[2] Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act). URL: https://eur-lex.europa.eu/eli/reg/2024/1689/oj

[3] Don't be fooled by Meta’s fine for data breaches, says Johnny Ryan, 24 May 2023. URL: https://www.economist.com/by-invitation/2023/05/24/dont-be-fooled-by-metas-fine-for-data-breaches-says-johnny-ryan. Also see ICCL, Irish Big Tech enforcement in 2023, 29 May 2024. URL: https://www.iccl.ie/news/irish-big-tech-enforcement-in-2023/

[4] AI Act, Article 3(26).

[5] AI Act, Article 5.

[6] The Dutch Data Protection Authority and the Dutch Authority for Digital Infrastructure, in their ‘1st (interim) advice supervisory structure AI Act’, identify critical infrastructure as one such high-risk application. URL: https://www.autoriteitpersoonsgegevens.nl/en/system/files?file=2024-06/20231107%20EN%201st%20%28interim%29%20advice%20supervisory%20structure%20AI%20Act.pdf

[7] AI Act, Article 74(6) and (7) specify that the relevant national financial supervisory authority shall be the MSA for high-risk AI systems in that sector.

[8] AI Act, Annex III 5(b).

[9] AI Act, Article 77.

[10] AI Act, Article 82(1).

[11] Not to be confused with the AI Advisory Council established by the Department of Enterprise, Trade and Employment.