Europe should follow Africa’s lead on tech and elections

5 March 2024

This morning ICCL Enforce wrote to Thierry Breton, European Commissioner for the Internal Market, urging that Europe follow Africa's example in protecting elections from digital threats. The African Union has introduced new rules for digital media that prohibit TikTok, Facebook, YouTube, and other companies from building up data about users’ sexual desires, political and religious views, health conditions, and ethnicity, and from using that data to pick which videos and stories are shown to people. The new African rules also prohibit the broadcast of data about voters by online advertising technologies such as "Real-Time Bidding". Enforce urges Europe to follow Africa's lead.

For comment: Dr Johnny Ryan, Director of Enforce & Senior Fellow 

For media queries: ruth.mccourt@iccl.ie / +353874157162 

Thierry Breton
Commissioner for Internal Market

cc
Dr Rita Wezenbeek
Director, European Commission DG Connect F 
Anu Talus
Chairperson, European Data Protection Board

5 March 2024

African Union innovation to protect elections in the context of digital platforms

Dear Mr Commissioner,

  1. We support your vital work to protect Europe’s elections, and your personal commitment to confronting pernicious “big tech” business models that threaten our society and way of life. Please regard this letter as a contribution to your Directorate General’s current consultation,[1] and to your personal deliberations on this pressing matter. We draw your attention to an important initiative of the African Union.

  2. On 3 November, all African electoral authorities unanimously adopted the “Principles and Guidelines for the Use of Digital and Social Media in Elections in Africa”, which we have enclosed for your convenience. These guidelines were only recently made public.

  3. We highlight two elements in the African Union Guidelines that should positively inform the European Commission’s forthcoming Guidelines: 

    Element 1. Effective controls on recommender systems; and
    Element 2. Prohibition against adtech broadcast (“RTB”) of data about voters.

Element 1: Effective controls on recommender systems

  1. The African Union Guidelines specify measures to prevent recommender systems from amplifying disinformation and misinformation:

11.13   Recommender systems must not process any data that can be associated with a person and that could categorise their sensitive personal characteristics, including the following categories of information:

(a) Religious or philosophical beliefs
(b) Race or ethnic origin
(c) Trade union membership
(d) Political persuasion 
(e) Health or sex life

11.14   The exception to this prohibition in subsection 11.13 above is where the individual concerned has given their specific consent for the use of the specific category of information. In this case, the digital service provider must present, at all times and places where the recommender system is active, the means for the user to switch off the recommender system again. These means must be highly visible and immediately accessible.

Special categories of personal data

  1. The “sensitive personal characteristics” in 11.13 of the African Union Guidelines are closely modelled on the definition of “special categories of personal data” in Article 9 of Regulation (EU) 2016/679. This is significant. The European Court has confirmed a wide interpretation of special categories of personal data.[2] Thus, for example, data as routine as recording where a person decides to pause a video indirectly reveals special category data if that video is sexual. It is inevitable that recommender systems rely on the processing of special category data. 

  2. It is a well-established principle of EU Law that companies must carefully control, monitor, and account for their use of special categories of personal data, distinct from other personal data.[3] Therefore, very large online platforms are required to have already implemented the necessary distinctions in how their systems handle different types of data. The approach taken in the African Union Guidelines would therefore create no new technical requirement on the part of the companies concerned. 

  3. From the perspective of digital platforms, Regulation (EU) 2016/679 prohibits the processing of special categories of personal data unless a person has given their explicit (two-step) consent for it.[4] To our knowledge this two-step consent has been neither sought nor obtained. Therefore, the African Union Guidelines on recommender systems are compatible with the existing prohibition under the GDPR of such systems being on by default. This should be made explicit in the Commission’s forthcoming Guidelines.

Profiling 

  1. Article 38 of the Digital Services Act provides that recommender systems based on a profile must be optional. Regulation (EU) 2016/679 imposes further legal requirements on companies that implement recommender systems that involve profiling data subjects, before they initiate data processing: 

    1. Conduct a Data Protection Impact Assessment (Article 35(3)(a)), and notify a data protection supervisory authority of any risk that cannot be mitigated (Article 36);
    2. Demonstrate a lawful basis for the specific purposes for which they intend to conduct profiling (Article 5(1)(b) and Article 7);
    3. Ensure the technical ability to discontinue the profiling when requested to do so by a person being profiled (Article 21 and Article 22); and
    4. Ensure the ability to delete the data concerned where necessary (Article 17).

Thus, very large online platforms and search engines must already have created the necessary systems to switch off profiling under existing EU Law. The approach taken in the African Union Guidelines therefore creates no new technical requirement on the part of the companies concerned.

Necessity of action 

  1. Recommender systems feed each voter a personalised diet of content estimated to provoke or outrage that person. This amplifies disinformation and misinformation from a small core group that would not otherwise be widely seen. As the European Commission’s report in August 2023 concluded, the reach of Russian disinformation about Ukraine was achieved by pro-Kremlin actors and by “algorithmic recommendation by the platforms”.[5] 

  2. Last month, Gartner published a ranking of the top risks identified by over 300 risk management leaders, risk professionals, auditors, and senior executives. Gartner ranked “Escalating Political Polarization”, driven by "reinforcing social media algorithms", number 2 in its top 5 risks.[6] 

  3. Despite this evidence and concern, the term “recommender system” is absent from the provisional draft “Guidelines for Providers of Very Large Online Platforms and Very Large Online Search Engines on the Mitigation of Systemic Risks for Electoral Processes”. We suggest this omission should be reconsidered. 

  4. Therefore, we urge the Commission to follow the African Union’s example: the Commission should insert into its forthcoming guidelines that recommender systems (based on a profile or based on special category data) should not operate except where a user has confirmed their decision to switch them on.  

  5. We also urge the Commission to use the procedure provided in Article 70(1)(e) of Regulation (EU) 2016/679 to request that the European Data Protection Board investigate the profiling and processing of special category data for recommender systems, and take action to protect fundamental freedoms.

Element 2: Prohibition against adtech broadcast (“RTB”) of data about voters 

  1. The African Union Guidelines specify safeguards to protect voters against disinformation and manipulation caused by the online advertising “Real-Time Bidding” (RTB) system:

2.3.4.1 Personal data broadcast

This is the passing of personal data (including individuals' online behaviour, their location in the real world and codes to identify them) to external entities by advertising technology companies. This exposes voters to profiling and manipulation.

9.5      The state should ensure that personal data is protected, that data holders protect and secure the data in their possession or under their control, that such data is not unlawfully shared with any third party, and that its use, availability and longevity is in line with data protection standards.

9.6       The state should ensure effective data protection under an independent data protection authority. All parties have a duty to comply with legislation on privacy and personal data protection throughout the electoral cycle and using the following international data protection standards…

11.15   A person may consent to the use of their data for advertising by a specific digital or social media entity. In this case, the digital service provider should completely anonymise that data before sharing it with any other entity (including entities within the same company) so that it can never be linked to that person again. 

Necessity of action

  1. The online advertising “Real-Time Bidding” (RTB) system broadcasts intimate data about voters very widely, and without security, from almost all advertising-funded websites and apps. This exposes voters to disinformation, misinformation and manipulation. 

  2. This data free-for-all also enables a data arbitrage in which disinformation media monetise data about the people who visit high quality journalism media. This undermines trustworthy news in two ways. First, it undermines journalism publishers’ ability to charge adequate prices for their advertising space, because a person who is attractive to an advertiser by virtue of their presence on high quality media sites and apps can be re-targeted on very low quality alternatives. Second, it provides advertising revenue for disinformation media that would not otherwise have a viable business. RTB is the cancer at the heart of high quality journalism, and the lifeblood of disinformation clickbait. 

  3. Almost every time a voter loads a page on a website or uses an app, an RTB auction occurs to determine what ad will appear in front of them. This auction broadcasts what the voter is doing online, and where they physically are, to many other companies in order to solicit their bids for the opportunity to show the voter their ad. This hands the private data of voters to firms across the globe (including Russia and China) without any control over what is then done with the data.[7]  

  4. RTB data include what voters read, watch, and listen to; inferences about their political views, sexual preferences, religious faith, ethnicity, and health conditions; and where they physically are, sometimes down to their GPS coordinates. They also include ID codes that help tie together many pieces of RTB data over time, so that very intimate profiles can be maintained about each voter. According to industry data we have obtained, this happens to the average person in Europe 376 times per day.[8]  

  5. The RTB data free-for-all enables malicious actors inside and outside the Union to build intimate dossiers on each voter, exposing them to automated manipulation. This is not a new hazard. For example, we previously highlighted an RTB campaign in the 2019 Polish elections that targeted 1.4 million voters based on their sexual orientation.[9] 

  6. We urge the Commission to follow the African Union’s example. The Commission should insert the following into its forthcoming guidelines: First, no personal data, including special categories of personal data, should be processed within a group of undertakings (controlling or controlled VLOP/VLOSE undertakings)[10] or shared with other entities. Second, only anonymised data that are entirely unlinkable to a person should instead be processed within a group of such undertakings or shared with other entities for those purposes. Online advertising can operate without widely broadcasting European people’s personal data. 

  7. The widespread broadcasting of personal and special category data is a data breach that infringes the well-established “confidentiality and integrity” principle, also known as the “security” principle (Article 5(1)(f) of Regulation (EU) 2016/679). 

  8. Therefore, we also urge the Commission to use the procedure provided in Article 70(1)(e) of Regulation (EU) 2016/679 to request that the European Data Protection Board investigate RTB broadcasting of personal and special category data, and take appropriate action to protect fundamental freedoms. 

  9. Protecting our elections and way of life requires not incrementalism, but urgent and decisive action. The African Union Guidelines provide clear, sound, and concrete measures to protect Europe. We urge you to include them in your Guidelines. In addition, we hope that you will invite the EDPB to contribute to the safety of the Union. We are at your disposal to assist in your deliberations.

Sincerely,

Dr Johnny Ryan FRHistS
Director, ICCL Enforce

Enclosure:
Principles and Guidelines for the Use of Digital and Social Media in Elections in Africa.

Notes

[1] https://digital-strategy.ec.europa.eu/en/news/commission-gathering-views-draft-dsa-guidelines-election-integrity

[2] See for example judgment of 1 August 2022, Vyriausioji tarnybinės etikos komisija, C‑184/20, EU:C:2022:601, paragraphs 125-127.

[3] See for example judgment of 4 July 2023, Meta Platforms and Others v Bundeskartellamt, C‑252/21, EU:C:2023:537, paragraph 89.

[4] Article 9, Regulation (EU) 2016/679; and “Guidelines 05/2020 on consent under Regulation 2016/679”, European Data Protection Board, 4 May 2020 (URL: https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf ), pp. 20-22.

[5] “Digital Services Act: Application of the Risk Management Framework to Russian disinformation campaigns”, European Commission, 30 August 2023 (URL: https://op.europa.eu/en/publication-detail/-/publication/c1d645d0-42f5-11ee-a8b8-01aa75ed71a1/language-en), p. 64.

[6] https://www.gartner.com/en/newsroom/press-releases/2024-02-07-gartner-survey-shows-political-polarization-is-a-new-top-emerging-risk-for-enterprises

[7] https://www.iccl.ie/wp-content/uploads/2023/11/Europes-hidden-security-crisis.pdf

[8] https://www.iccl.ie/wp-content/uploads/2022/05/Mass-data-breach-of-Europe-and-US-data-1.pdf

[9] https://www.iccl.ie/digital-data/rtb-data-breach-2-years-on/

[10] In the meaning of Recital 37, Regulation (EU) 2016/679.