On 26 April 2024, the European Commission (Commission) formally adopted guidelines (Guidelines) for Very Large Online Platforms and Search Engines (VLOP/SEs – platforms/search engines with more than 45 million EU active users) to help them comply with their Digital Services Act (DSA) duties to protect the integrity of elections. For further information on the DSA click here.
Under the DSA, VLOP/SEs (which are exclusively policed by the Commission) must meet onerous requirements to tackle illegal and harmful online content, including undertaking risk assessments and implementing measures to mitigate major societal risks – such as “systemic risks” to EU users that may impact the integrity of elections.
The EU has made it clear that protecting the integrity of elections is one of the key priorities for the enforcement of the DSA. Given the high number of elections taking place in the EU in 2024, including the upcoming European elections, and evidence of past electoral interference, it is unsurprising that the Commission is “carefully monitoring” the measures being taken by VLOP/SEs to prevent negative effects on democratic processes, civic discourse and electoral processes in the context of the DSA. The BBC recently reported that the use of AI deepfakes is “blurring reality” in the upcoming Indian elections, and India's Election Commission has stated that fake news has the potential to “set the country on fire” (a problem no doubt exacerbated by the lack of regulation in India).
Back in the EU, the Guidelines provide clarification on the obligation for VLOP/SEs to carry out risk assessments and to implement reasonable, proportionate, and effective mitigation measures for risks related to electoral processes, as required by Articles 34(1)(c) and 35 of the DSA.
The specific mitigation measures to be taken by a VLOP/SE will depend on the specificities of its service and risk profile. However, the Guidelines set out “best practices” to address systemic risks that could impact the integrity of democratic electoral processes “at this moment in time”, and recommend that these best practices be applied before, during and after electoral events.
Recommended best practices include:
- The reinforcement of internal processes including by setting up internal teams with adequate resources;
- The implementation of election-specific risk mitigation measures tailored to each individual electoral period and local context;
- The clear labelling of political advertising in anticipation of the new regulation on the transparency and targeting of political advertising;
- The adoption of specific mitigation measures linked to generative AI – e.g. by clearly labelling content generated by AI (such as deepfakes), adapting terms and conditions accordingly, and enforcing them adequately;
- Cooperation with EU level and national authorities, independent experts, and civil society organisations, including the European Digital Media Observatory (EDMO) hubs and independent fact-checking organisations to foster an efficient exchange of information before, during and after an election and facilitate the use of adequate mitigation measures, including in the areas of Foreign Information Manipulation and Interference (FIMI), disinformation and cybersecurity;
- The adoption of specific measures, including an incident response mechanism to reduce the impact of incidents that could have a significant effect on the election outcome or turnout; and
- The assessment of the effectiveness of the measures through post-election reviews and publication of a non-confidential version of such post-election review documents to allow public feedback on the risk mitigation measures put in place.
Whilst VLOP/SEs have some flexibility in how they meet their DSA obligations in relation to systemic risks, the Commission has stated that VLOP/SEs who do not follow these Guidelines “must prove to the Commission that the measures undertaken are equally effective in mitigating the risks”. If the Commission receives information casting doubt on the suitability of such measures, it can make a request for information (RFI) or start formal proceedings against a VLOP/SE under the DSA (it has already made numerous RFIs and commenced proceedings against a growing number of VLOPs for a range of alleged DSA failings – for one such example see our article here).
The Commission has engaged in election integrity readiness dialogues with several VLOP/SEs ahead of national elections in order to monitor their effective compliance with the DSA. It also announced plans to carry out so-called “stress tests” at the end of April 2024 with some major platforms, to test the most effective use of the instruments and co-operative mechanisms that have been put in place. Meta, TikTok and X previously requested such voluntary exercises (overseen by the Commission's enforcement team) to check whether their operations complied with the DSA. So watch this space for further DSA related updates.
“With the Digital Services Act, Europe is the first continent with a law to address systemic risks on online platforms that can have real-world negative effects on our democratic societies. 2024 is a significant year for elections. That is why with today’s guidelines we are making full use of all the tools offered by the DSA to ensure platforms comply with their obligations and are not misused to manipulate our elections, while safeguarding freedom of expression.” Thierry Breton, Commissioner for Internal Market
https://ec.europa.eu/commission/presscorner/detail/en/ip_24_1707