A Critical Look at the DSA’s First Risk Assessments: Civic Discourse and Electoral Processes

An initial analysis of the first round of risk assessments under the EU Digital Services Act.

by LibertiesEU

In November 2024, the first-ever risk assessments required under the Digital Services Act (DSA) were published by 19 Very Large Online Platforms and Very Large Online Search Engines (VLOPs and VLOSEs). These reports were meant to be a turning point in platform accountability, providing transparency on how digital services affect democracy, civic discourse, and electoral integrity. They did not live up to expectations.

Liberties and the European Partnership for Democracy (EPD) have closely examined these reports and uncovered serious gaps in how platforms assess and address the risks they create. While the reports focus heavily on disinformation and election integrity, they fail to tackle broader, systemic threats, such as the role of algorithms in shaping political discourse, the suppression of civil society voices, and the risks of polarisation.

Why This Matters

Social media and search platforms play a critical role in shaping public debate, influencing what people see, who they engage with, and how they access political information. For years, civil society organisations, researchers, and advocacy groups have raised concerns about the impact of algorithms and content moderation policies on our civic life. The DSA was designed to bring more accountability and oversight, ensuring that very large online platforms take responsibility for the risks they pose. However, our analysis of the first set of risk assessments of Facebook, Instagram, Google Search, YouTube, TikTok, and X shows:

  • A lack of consistency. Each platform followed a different structure and methodology, making it difficult for watchdogs to analyse the assessments.
  • An underexamined role of recommender algorithms. Recommender systems shape online discourse by prioritising some content over other content, yet most platforms provided minimal detail about how these systems function. The impact of algorithmic amplification on civic discourse remains a major blind spot.
  • Transparency gaps. Platforms provided very little detail on the nature and impact of the mitigation measures they introduced, making it virtually impossible to assess whether they are reasonable, proportionate and effective.

Critically, we also observed an imbalance in focus. While election integrity and misinformation are undoubtedly important, platforms largely neglected broader threats to civic discourse. Key issues such as the suppression of political content, influencer-driven misinformation, and the targeting of civil society organisations received minimal attention.

Our Key Recommendations

To improve platform accountability and truly protect democratic discourse, we urgently call for:

  1. Clearer guidance from the European Commission to standardise reporting structures and transparency requirements.
  2. Comprehensive analysis of recommender systems in the risk assessment reports, making algorithmic decision-making more transparent.
  3. Greater disclosure from platforms on the nature and effectiveness of their mitigation measures.

We also urge platforms to take a broader approach to risk assessment, beyond just elections and disinformation, to encompass all aspects of civic discourse. To ensure a more comprehensive and effective process, platforms must actively engage a diverse range of stakeholders in future risk assessments and mitigation efforts, helping to prevent critical gaps and oversights.

The DSA could be a major step forward, but without meaningful scrutiny and enforcement, its impact will remain limited. Civil society, researchers, and policymakers must work together to ensure platforms don’t just check compliance boxes, but actually improve the health of our digital public sphere.

Read our full policy brief to explore our findings and recommendations in detail.
