(BRUSSELS) – The European Commission published guidelines Tuesday for designated ‘very large online platforms’ and ‘very large online search engines’, to help them mitigate systemic risks online that may affect the integrity of upcoming elections.
Under the Digital Services Act (DSA), designated services with more than 45 million active users in the EU are obliged to mitigate risks related to electoral processes, while safeguarding fundamental rights, including the right to freedom of expression.
“Ahead of crucial European elections, this includes obligations for platforms to protect users from risks related to electoral processes, like manipulation or disinformation,” said EC Vice-President Margrethe Vestager. “Today’s guidelines provide concrete recommendations for platforms to put this obligation into practice.”
The guidelines include specific measures ahead of the upcoming European elections. Given their unique cross-border and European dimension, Very Large Online Platforms and Search Engines should ensure that sufficient resources and risk mitigation measures are available and distributed in a way that is proportionate to the risk assessments. More specifically, they should:
- Reinforce their internal processes, including by setting up internal teams with adequate resources and by drawing on available analysis and information on local, context-specific risks and on how users use their services to search for and obtain information before, during and after elections, in order to improve their mitigation measures.
- Implement elections-specific risk mitigation measures tailored to each individual electoral period and local context. Among the mitigation measures included in the guidelines, Very Large Online Platforms and Search Engines should promote official information on electoral processes, implement media literacy initiatives, and adapt their recommender systems to empower users and reduce the monetisation and virality of content that threatens the integrity of electoral processes. Moreover, political advertising should be clearly labelled as such, in anticipation of the new regulation on the transparency and targeting of political advertising.
- Adopt specific mitigation measures linked to generative AI: Very Large Online Platforms and Search Engines whose services could be used to create and/or disseminate generative AI content should assess and mitigate specific risks linked to AI, for example by clearly labelling content generated by AI (such as deepfakes), adapting their terms and conditions accordingly and enforcing them adequately.
- Cooperate with EU-level and national authorities, independent experts, and civil society organisations to foster an efficient exchange of information before, during and after the election and to facilitate the use of adequate mitigation measures, including in the areas of Foreign Information Manipulation and Interference (FIMI), disinformation and cybersecurity.
- Adopt specific measures, including an incident response mechanism, during an electoral period to reduce the impact of incidents that could have a significant effect on the election outcome or turnout.
- Assess the effectiveness of the measures through post-election reviews. Very Large Online Platforms and Search Engines should publish a non-confidential version of such post-election review documents, providing an opportunity for public feedback on the risk mitigation measures put in place.
The guidelines take into account the input received from the public consultation launched by the Commission on 8 February 2024 and also build on the ongoing work under the Code of Practice on disinformation. The Commission also cooperated with Digital Services Coordinators in the framework of the European Board for Digital Services on the guidelines.