
AI Chatbots Provide Biased Voting Advice, Ignoring Local Parties

Favicon for www.autoriteitpersoonsgegevens.nl Dutch DPA News
Published March 12th, 2026
Detected March 13th, 2026

Summary

The Dutch Data Protection Authority (AP) released a study showing AI chatbots rarely recommend local political parties when providing voting advice. The AP warns that this bias makes chatbots unreliable voting aids and calls on providers to implement measures to prevent their systems from being used for voting advice, especially in light of the EU AI Act.

What changed

The Dutch Data Protection Authority (AP) has published a study revealing that AI chatbots, such as ChatGPT, Claude, Gemini, Grok, and Mistral, overwhelmingly fail to recommend local political parties when users seek voting advice for municipal elections. The study found that local parties were recommended as a first choice in less than one percent of cases, despite their significant electoral success. This bias is attributed to the training data predominantly featuring national parties and less extensive information on local ones, leading to a distorted representation of the political landscape.

The AP has reiterated its warning that AI chatbots are unsuitable as voting aids: they are unreliable, lack transparency, and may provide incomplete or incorrect information, risking voters being influenced by biased recommendations. The authority urges AI chatbot providers to implement measures to prevent their systems from offering voting advice, noting that under the EU AI Act, systems that influence voting behavior are classified as high-risk. Voters are advised to rely on verified sources such as traditional voting aids, news media, and party manifestos, and to critically verify any information obtained from AI chatbots.

What to do next

  1. Review AI chatbot usage policies for election-related advice
  2. Educate users on the limitations and biases of AI chatbots for voting information
  3. Monitor compliance with the EU AI Act regarding high-risk AI systems influencing voting behavior

Source document (simplified)

AI chatbots ignore local parties when giving voting advice

12 March 2026
Themes: Coordination of algorithmic and AI supervision; EU AI Act

AI chatbots hardly ever mention local political parties when users ask which party they should vote for in municipal elections. This is shown by a study conducted by the AP into the use of chatbots as voting aids.

In less than one per cent of the voting advice given by chatbots, a specific local party was recommended as the first choice. This is in sharp contrast to the more than thirty per cent of the votes local parties obtained in the last municipal elections.

“AI chatbots are unreliable and present a distorted picture of the political landscape. When local parties are barely mentioned in voting advice, voters do not get a clear picture of the options they have”, said Aleid Wolfsen, Chair of the AP. On Thursday, he presented the investigation report to the Minister of the Interior and Kingdom Relations, Pieter Heerma.

Lack of information about local parties

One possible explanation is that the chatbots have mainly been trained using large amounts of data from the Internet. Local political parties are relatively rarely presented in this data. Furthermore, information about these parties is often less extensive or less up-to-date than information about national parties. This can result in local parties being unintentionally disadvantaged when chatbots provide voting advice.

Chatbots often do provide voting advice

For the study, the AP tested five widely used AI chatbots: ChatGPT, Claude, Gemini, Grok and Mistral. Although AI chatbots are not suitable for providing voting advice, they nevertheless gave it in all but seven per cent of cases.

Three of the five chatbots nearly always provided voting advice when users explicitly requested it, often with a clear preference for one party. In some cases, chatbots initially stated that they had insufficient information to provide voting advice, but after looking up information on the Internet, went on to do so anyway.

Risk to voters and the democratic process

The results of this study confirm, as the previous AP study in the run-up to the 2025 general elections did, that AI chatbots are not suitable voting aids. The voting advice is not transparent: voters do not know why they are receiving a particular recommendation, and it is difficult to verify. This creates the risk of voters casting their votes based on incomplete or incorrect information.

A previous investigation by the AP also showed that voting recommendations provided by AI chatbots present a highly distorted and polarised view of the political landscape. With this new study, the AP also demonstrates that there is still no protection in place to prevent chatbots from giving voting advice. When providing voting advice, chatbots also appear to include information that should be irrelevant, such as the voter's place of residence.

The AP is therefore once again calling on providers of AI chatbots to implement measures to prevent their systems from being used for voting advice. Under the European AI Act, AI systems that aim to influence voting behaviour are classified as ‘high-risk systems’ and are subject to very strict rules.

Advice to voters

Once again, the AP is warning people not to use chatbots for voting advice. There are reliable and verified sources for information and advice, such as traditional voting aids (StemWijzer and Kieskompas), news media, and political party manifestos. It is important to use multiple sources and to always thoroughly verify information from AI chatbots.

Source

Analysis generated by AI. Source diff and links are from the original.

Classification

Agency
Dutch Data Protection Authority (AP)
Published
March 12th, 2026
Instrument
Notice
Legal weight
Non-binding
Stage
Final
Change scope
Substantive

Who this affects

Applies to
Consumers, Technology companies
Geographic scope
Netherlands

Taxonomy

Primary area
Data Privacy
Operational domain
Compliance
Topics
Artificial Intelligence, Elections, Data Privacy
