Report September 2025
Submitted
Your organisation description
Crisis and Elections Response
Crisis 2025
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
- Impersonation and inauthentic TikTok political accounts violating TikTok’s policies
During the reporting period, DRI advanced its monitoring of TikTok “murky accounts”, a term we use for accounts with unclear affiliations and questionable authenticity that actively promote political parties and candidates. We flagged 606 such accounts across the German (138), Romanian (323), and Polish (145) elections, of which 414 were removed by TikTok following an internal review based on its Terms of Service and Community Guidelines. Our monitoring covered accounts supporting candidates and parties across the political spectrum.
In Germany, 69% of flagged accounts promoted AfD politicians and party content and falsely presented themselves as official party pages. In Romania, murky accounts most frequently supported Călin Georgescu (35%), liberal party candidate Elena Lasconi (14.2%), or George Simion of the right-wing Alliance for the Union of Romanians (11%). 21 murky accounts (8.3%) supported independent candidate Nicușor Dan, who ultimately won the 2025 election in the second round.
In Poland, far-right candidate Sławomir Mentzen had the highest murky account support (21 accounts), followed by extreme-right candidate Grzegorz Braun (15 accounts). The conservative Law and Justice (PiS) and far-right Konfederacja parties also received significant murky account backing. Notably, accounts supporting Konfederacja generated over 12 times more engagement than those linked to the second most-engaged politician or party.
- Scroll for a Fake: TikTok Murky Accounts Impersonate German Parties and Politicians Ahead of Elections | 18.02.2025
- Scroll, Like, Deceive: Murky Political Accounts on TikTok before the German 2025 Elections | 21.03.2025
- 323 murky accounts and one denied candidacy: TikTok's role in Romania’s 2025 election | 11.06.2025
- Unverified and Unchecked: Murky TikTok Accounts in Poland’s 2025 Elections | 18.06.2025
- Chatbots misinforming about the German election, and the prevalence of generative AI in campaigns
Since 2024, DRI has evaluated and monitored the ability of large language model (LLM)-powered chatbots to deliver accurate election-related information across multiple languages and regions. Ahead of the 2025 German elections, we continued this monitoring by assessing six chatbots (ChatGPT 4.0; ChatGPT 4.0 Turbo, available to ChatGPT premium subscribers; Gemini; Copilot; Grok; and Perplexity.AI) in both German and English on 22 questions about the electoral process and key political topics in Germany. Overall, we found that chatbot providers have made progress in line with some DRI recommendations, such as refraining from answering, and with EU Commission guidelines, such as referring users to voting advice tools and to sources provided by electoral authorities. However, the problem of misleading or incomplete information persists, suggesting that most major AI providers have not put robust risk mitigation systems in place.
Chatbots were not the only way in which AI directly affected electoral processes in the EU. In Germany, the far-right party AfD frequently used AI-generated images and videos, often without disclosure. These visuals served multiple purposes, from attacking political opponents to reinforcing the narrative of a Germany in decline. Our research highlights how these tactics blend AI-generated misinformation, emotional priming, and aggressive political attacks to drive engagement. In Poland, candidates also leveraged generative AI in their campaigns.
- Inconsistent and Unreliable: Chatbots Provide Inaccurate Information on German Elections | 12.02.2025
- The AfD on Facebook: Fear, Anti-CDU posts and Abuse of AI | 03.03.2025
- Engagement Wars: Inside the Polish Presidential Campaigns on Social Media | 30.05.2025
- Algorithms and Agendas: The Digital Fight for Poland’s Presidency 2025 | 31.07.2025
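The chatbot assessment described above can be sketched, in highly simplified form, as a classification harness that sorts each answer into the broad response types we discuss (declining to answer, referring to official sources, or answering directly). This is an illustrative assumption, not DRI's actual evaluation pipeline; the category labels and keyword cues are hypothetical, and in practice accuracy was assessed by human reviewers.

```python
# Illustrative sketch only (not DRI's actual methodology): classify chatbot
# answers to election questions into coarse response categories. The cue
# lists below are assumed keyword heuristics for demonstration purposes.

REFERRAL_CUES = ("electoral authority", "official source", "voting advice")
DECLINE_CUES = ("i cannot", "i can't", "unable to answer")

def classify_response(answer: str) -> str:
    """Return a coarse category for one chatbot answer."""
    text = answer.lower()
    if any(cue in text for cue in DECLINE_CUES):
        return "declined"      # the model refrained from answering
    if any(cue in text for cue in REFERRAL_CUES):
        return "referred"      # the model pointed users to official sources
    return "substantive"       # direct answer; accuracy must be checked manually

def summarise(answers: dict[str, str]) -> dict[str, str]:
    """Map each chatbot name to the category of its answer."""
    return {bot: classify_response(ans) for bot, ans in answers.items()}
```

In a real audit, the same fixed question set would be posed to each chatbot in each language, and the "substantive" answers would then be reviewed by researchers for accuracy and completeness.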
- Recommender Systems and Electoral Integrity
As TikTok and Instagram increasingly serve as prominent sources of political information, understanding their recommender algorithms is essential for ensuring users can maintain control over their feeds, encounter diverse perspectives, and engage meaningfully in democratic processes, particularly during elections. Throughout the 2025 German federal election, we manually collected videos from both platforms using five user profiles, each representing a plausible individual from the German political spectrum with a distinct political leaning and level of political interest, to assess how these variables shaped the amount of political content each profile encountered. We also compared results across the two platforms. Our findings show that TikTok pushes political content more aggressively than Instagram, but unevenly: profiles with strong leanings toward the AfD or BSW received far more targeted recommendations, while centrist and left-leaning profiles saw fewer, less relevant videos. The least political profiles saw almost none.
Our analysis was further underpinned by a literature review of political exposure bias on TikTok, Instagram, and X during the 2024 U.S. and 2025 German elections. The results show that platforms amplified right-leaning or far-right content even for neutral users, most strongly on X and TikTok and only weakly on Instagram. Users aligned with far-right parties received more targeted recommendations and fewer cross-party recommendations. Persistent political exposure bias may constitute a systemic risk to elections when it fuels polarisation, radicalisation, and fragmentation, especially if driven by manipulation, inauthentic activity, or opaque platform practices. Undisclosed downranking of political actors may also breach Article 17 of the DSA, which requires transparency in moderation decisions.
- Filtered for You: Algorithmic Bias on TikTok and Instagram in Germany | 10.04.2025
- Political Exposure Bias in Recommender Systems: A Review of Evidence from the U.S. and German Elections | 30.04.2025
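The core measurement behind the profile-based audit above can be sketched as a simple tally: each manually collected video is logged against the profile that saw it, with a manual political/non-political label, and the share of political content per profile is then compared. The data structure and function below are assumptions for illustration, not DRI's actual code.

```python
# Minimal sketch (assumed structure, not DRI's actual pipeline) of computing
# the share of political videos each audit profile encountered in its feed.

from collections import defaultdict

def political_share(feed_log: list[tuple[str, bool]]) -> dict[str, float]:
    """feed_log holds (profile, is_political) pairs, one per collected video;
    return the fraction of political videos seen by each profile."""
    seen = defaultdict(int)
    political = defaultdict(int)
    for profile, is_political in feed_log:
        seen[profile] += 1
        political[profile] += is_political  # bool counts as 0 or 1
    return {p: political[p] / seen[p] for p in seen}

# Hypothetical example: a strongly AfD-leaning profile vs. a low-interest one.
log = [("afd_leaning", True), ("afd_leaning", True), ("afd_leaning", False),
       ("low_interest", False), ("low_interest", False)]
# afd_leaning -> 2/3 political, low_interest -> 0.0
```

Comparing these shares across profiles, and separately per platform, is what allows statements such as "TikTok pushes political content more aggressively than Instagram, but unevenly."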
- Flagging Reduced Platform Commitments Under the CoC
As the EU Code of Practice transitioned into a legally binding Code of Conduct under the DSA, major platforms scaled back commitments, particularly in fact-checking, political advertising transparency, and support for independent research. Microsoft, Google, and TikTok reduced or withdrew key measures, while all platforms abandoned Commitment 27 on facilitating researcher data access. These reductions, often vaguely justified, risk undermining the CoC’s effectiveness in combating disinformation and ensuring accountability.
- Big tech is backing out of commitments countering disinformation—What’s Next for the EU’s Code of Practice? | 07.02.2025
Mitigations in place
- Code of Conduct on Disinformation
DRI continued its reporting efforts under the Rapid Response System of the Code of Conduct on Disinformation, actively collaborating with signatories. We regularly attended coordination meetings and contributed by flagging and discussing content that potentially violated platforms’ terms of use, including the risks identified and mentioned above. We continued to directly share findings with platforms to push for platform improvement and accountability.
- Raised awareness about threats and built networks with relevant stakeholders through roundtables
During our monitoring of the German, Romanian, and Polish elections, we collaborated with key stakeholders, including the Polish think tank Institute of Public Affairs (IPA), which monitored social media platforms around the 2025 Polish presidential elections. In its engagement analysis, IPA found that Mentzen’s Facebook account experienced several unusual spikes in activity, with multiple posts receiving exceptionally high and evenly distributed engagement within minutes, far above platform norms and at a greater rate than other candidates. This pattern, combined with abrupt shifts in follower growth and interaction rates that ended mid-April, suggests Mentzen’s online presence may have been partially boosted by inauthentic or artificially generated traffic.
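The kind of spike check described above can be illustrated with a robust outlier test: flag posts whose engagement sits far above the account's typical level, measured against the median rather than the mean so that the spikes themselves do not mask the baseline. This is a hedged sketch of the general technique, not IPA's actual methodology, and the threshold value is an assumption.

```python
# Illustrative sketch (not IPA's actual method): flag posts whose engagement
# deviates from the account's median by more than `threshold` times the
# median absolute deviation (a robust analogue of a z-score).

from statistics import median

def flag_spikes(engagements: list[int], threshold: float = 5.0) -> list[int]:
    """Return indices of posts with anomalously high engagement."""
    med = median(engagements)
    mad = median(abs(e - med) for e in engagements)
    if mad == 0:
        return []  # no variation in the baseline, nothing to compare against
    return [i for i, e in enumerate(engagements)
            if (e - med) / mad > threshold]

# Hypothetical per-post engagement counts: the last post spikes sharply.
# flag_spikes([100, 120, 90, 110, 105, 5000]) -> [5]
```

In a fuller analysis, such flags would be combined with the other signals mentioned above, such as how evenly engagement arrives within minutes of posting and abrupt changes in follower growth.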
We also raised public awareness through three roundtables and network collaborations. During the German elections, we participated in the Counter Disinformation Network (CDN), a pan-European collaborative initiative designed to pre-empt, detect, and counter information manipulation, where we shared our research findings and advocacy work.
The first DRI roundtable, a two-hour in-person event with around 20 EU NGOs, examined major online electoral risks during the German elections, sharing and gathering insights from participants’ chatbot auditing and social media monitoring efforts.
After the platform X failed to provide timely data access ahead of the German elections, DRI brought litigation against the VLOP, launching the first court case to test the DSA’s data access provisions. To examine the Berlin Regional Court’s ruling, DRI hosted a public webinar with legal and policy experts, highlighting the case’s implications for future platform accountability and data access. Key takeaways were published to support other CSOs considering similar advocacy.
In September, we hosted Retrospective Insights: Election Monitoring Efforts to Preserve Information Integrity, presenting findings from our two-year access://democracy project monitoring online discourse during key EU elections. Nearly 30 representatives of civil society organisations and monitoring agencies joined to share the results of their social media monitoring efforts and assess whether platforms are meeting new DSA obligations. The discussion strengthened cross-organisation collaboration, with shared best practices informing a forthcoming meta-analysis that will be made publicly available to support peer capacity building.
- Elections, Algorithms, and Accountability: Digital Platforms and the 2025 German Federal Elections | 17.03.2025
- DSA in Court: What we learned from suing X | 17.07.2025
- Retrospective Insights: Election Monitoring Efforts to Preserve Information Integrity | 04.09.2025
- Communicating continued data access challenges
DRI used op-eds to raise public and policymaker awareness of ongoing barriers to platform data access under the DSA. In Unpacking TikTok’s Data Access Illusion, we exposed the shortcomings of TikTok’s Virtual Compute Environment, showing how its restrictive design renders it functionally unusable for meaningful research. In Why We’re Suing Elon Musk’s X for German Election Data, we explained our landmark legal case against X for failing to provide timely access to publicly available data, highlighting its implications for DSA enforcement and European research capacity. Together, these op-eds underscored the importance of effective data access.
- Unpacking TikTok’s Data Access Illusion | 12.06.2025
- Why we're suing Elon Musk's X for German election data | 27.02.2025