31 January 2024
ICCL, together with more than sixty other organisations across Ireland, has written to Coimisiún na Meán to highlight the need for robust measures to address the harms caused by "recommender system" algorithms. We recommend improvements to Coimisiún na Meán's draft Code for video platforms, including measures to ensure enforceability.
Included in this submission are stories of the impact of these algorithms on real people's lives, collected by our colleagues at Uplift:
“My beautiful, intelligent, accomplished niece was encouraged, incited to see suicide as a romantic way to end her life. She did end it. Earlier she had been encouraged to see more and more sites by people who espoused the idea that people suffering from mental health issues should stop their medications and force society to accept them as they were. This led her [into] a dangerous downturn from which she never recovered, leaving her poor parents devastated and her family changed for the worse.”
“My father has slowly been radicalised by the content pushed to his feed on Facebook. He watches the short videos and accepts all the information in the video without any verification on his part. If you ask him to verify it, he calls you a liar. The videos can directly state conflicting information, but he will accept it all as fact without thinking about it. This is fuelling his anti immigration thoughts and ideas. I fear he'll become homophobic too.”
Our brief letter to Coimisiún na Meán is presented in full below, and the longer submission document is linked here.
For comment: Dr Johnny Ryan, Director of Enforce & Senior Fellow
For media queries: ruth.mccourt@iccl.ie / +353874157162
Press resources
- Full submission document https://www.iccl.ie/wp-content/uploads/2024/01/submission-60-civil-society-organisations-Coimisiun-na-Mean_OSC-Consultation-Response.pdf
- Brief cover letter https://www.iccl.ie/wp-content/uploads/2024/01/submission-Letter-accompanying-submission-31-January-2024.pdf
- Poll showing 82% of the Irish public support the measures on recommender systems https://www.iccl.ie/wp-content/uploads/2024/01/submission-Appendix-1-Ireland-Thinks-poll-on-Coimisuin-na-Mean-measures.pdf
- Image of logos of the 60+ signatory organisations https://www.iccl.ie/wp-content/uploads/2024/01/62-civil-society-organisations-.png

Commissioner Niamh Hodnett
Coimisiún na Meán
One Shelbourne Building
Shelbourne Road, Dublin 4
31 January 2024
Draft Online Safety Code
Dear Commissioner Hodnett,
We represent more than sixty organisations, together comprising a diverse cross-section of Irish society. We are united in strongly supporting your decision to require that recommender systems based on intimately profiling people are turned off by default on social media video platforms.[1]
Algorithmic recommender systems select emotive and extreme content and show it to the people they estimate are most likely to be outraged. These people then spend longer on the platform, which allows Big Tech corporations to sell more ad space. Meta's own internal research disclosed that 64% of extremist group joins were caused by its own toxic algorithms.[2] Even more alarmingly, Amnesty found that TikTok’s algorithms exposed a 13-year-old child account to videos glorifying suicide less than an hour after the account was launched.[3]
In a national poll conducted by Ireland Thinks in January 2024, spanning all ages, education levels, incomes, and regions of Ireland, 82% of people were in favour of your initiative. We all agree that users - not Big Tech corporations’ algorithms - should be free to decide what they see and share.
Social media platforms promised to be places where people choose what they share with their friends. Instead, they manipulate and addict our children, promote suicide and self-loathing among our teens, turn communities against each other, and feed us a personalised diet of hate and disinformation for profit.
Our accompanying submission evinces the necessity of your initiative (part 1 of the enclosed submission), and proposes steps to strengthen it (part 3), additional measures to supplement it (part 4), and enhancements to facilitate enforcement (part 5).
To conclude, civil society organisations across Ireland fully support your decision to require that recommender systems based on intimately profiling people are turned off by default on social media video platforms.
Signed
Irish Council for Civil Liberties
Hope & Courage Collective
Uplift
People vs Big Tech
Community Work Ireland
Galway City Community Network
Cork Rebels for Peace
Irish Network Against Racism
Afri
Doras
Action for Choice
Social Rights Ireland
Helping Irish Hosts
Empower
Outhouse LGBTQ+ Centre
ShoutOut
Leitrim Volunteer Centre
European Anti-Poverty Network Ireland
Human Rights Sentinel
Donegal Intercultural Platform
Inishowen Together
Black and Irish
Dublin City Community Cooperative
Bridging The Gap Ireland
Bray for Love
Irish Traveller Movement
Clare Immigrant Support Centre
Mammies for Trans Rights
Together for Safety
Droichead FRC
Age Action
LGBT Ireland
Migrant Rights Centre Ireland
IDEN, Irish Doughnut Economics Network
Dublin LGBTQ+ Pride
National Women's Council
Irish Council for International Students
New Horizon Refugee Support
Pavee Point Traveller and Roma Centre
Belong To - LGBTQ+ Youth Ireland
Solas Project
National Traveller Womens Forum
Waterford Integration Services
Nasc, the migrant and refugee rights centre
Fermoy and Mallow Against Division
Women for Election
Circle VHA
Climate Action Wexford
International Community Dynamics CLG
Dublin Bay South Branch Social Democrats
Wicklow Volunteer Centre
Light Advisory
Women's Collective Ireland (WCI)
Good Day Cork
Parable Communications
Suas/STAND
Rialto Youth Project
Independent Living Movement Ireland (ILMI)
The Exchange Inishowen
NeuroPride Ireland
Friends of the Earth Ireland
Enclosures:
Notes
[1] Facebook, Instagram, YouTube, Udemy, TikTok, LinkedIn, X, Pinterest, Tumblr, Reddit.
[2] “Facebook Executives Shut Down Efforts to Make the Site Less Divisive”, Wall St. Journal, 26 May 2020 (URL: https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499 )
[3] “Global: TikTok’s ‘For You’ feed risks pushing children and young people towards harmful mental health content”, Amnesty International, November 2023 (URL: https://www.amnesty.org/en/latest/news/2023/11/tiktok-risks-pushing-children-towards-harmful-content/).