THE emergence of artificial intelligence brings both promise and peril to the democratic process as Bangladesh prepares for its national and local elections. While artificial intelligence offers tools for efficiency, communication and data management, it also poses threats to electoral integrity, political stability and social trust. Without proactive measures, the technologies could amplify vulnerabilities in the political culture, deepening polarisation, spreading disinformation and undermining public confidence in the electoral process.

AI in political campaigns

ARTIFICIAL intelligence is rapidly transforming how political actors campaign, communicate and compete. From personalised advertising on social media to AI-generated videos and voice deepfakes, the tools of persuasion are powerful, scalable and, crucially, difficult to trace. In the context of Bangladesh, where digital literacy remains uneven and regulatory oversight is weak, these tools can be easily weaponised to distort information, mislead voters and manipulate public opinion.

AI-powered micro-targeting allows political actors to deliver tailored messages to individual voters based on their browsing history, religious identity, income level or perceived political leanings. While this might seem like efficient campaigning, it can also be used to spread false or divisive narratives to specific communities without public accountability.

Moreover, generative artificial intelligence now makes it easy to create deepfake videos of political figures saying or doing things that they have never said or done. Such content can be deployed to smear opponents, sow confusion or incite violence. In a highly charged political environment like Bangladesh’s, even a short clip circulating on Facebook or YouTube for a few hours can do lasting damage.

Fertile ground for digital manipulation

BANGLADESH’S political environment is already characterised by intense partisanship, low trust in institutions and frequent allegations of electoral manipulation. Artificial intelligence-based tools can magnify these tensions.

Misinformation and disinformation: Political actors and their supporters may use artificial intelligence to generate misleading or fake news stories, doctored images or forged documents. Such content can spread rapidly on social media platforms such as Facebook, WhatsApp and TikTok, especially in rural or peri-urban areas where users may not have the skills or time to verify authenticity.

Psychological operations: Automated bots and troll farms can conduct coordinated influence operations to harass opponents, suppress dissent or create an illusion of consensus. Artificial intelligence tools can be programmed to flood platforms with narratives favouring those in power or discrediting opposition groups, potentially skewing online public discourse and marginalising independent voices.

Vote suppression tactics: Artificial intelligence could be used to target specific voter groups with false information about voting dates, locations or eligibility criteria. For example, artificial intelligence chatbots or auto-generated messages could deliberately mislead marginalised or minority communities, discouraging them from participating in elections altogether.

These are not hypothetical risks. Similar tactics have already been observed in countries such as the Philippines, Kenya, India and the United States, where artificial intelligence-driven propaganda and digital manipulation have disrupted democratic processes. Bangladesh, with its growing but under-regulated digital ecosystem, must act now to avoid similar outcomes.

Regulatory challenges

ONE of the biggest challenges in addressing artificial intelligence-related electoral risks in Bangladesh is the lack of preparedness among regulatory institutions. The Election Commission, though constitutionally mandated to ensure free and fair elections, lacks the technological infrastructure and trained personnel to detect and respond to artificial intelligence-driven manipulation.

Meanwhile, our legal frameworks such as the Cyber Security Act are geared more towards controlling dissent than protecting electoral integrity. The laws do not clearly define the use or misuse of artificial intelligence in political campaigns, nor do they provide mechanisms for oversight or public redress.

The private tech sector, including platforms such as Meta (Facebook), Google, and TikTok, plays a central role in how political content is disseminated in Bangladesh. Yet, there is minimal transparency about how the companies moderate political content, especially content generated with artificial intelligence. In the absence of mandatory disclosures or data-sharing agreements with the Election Commission or civil society, their algorithms continue to operate in a black box, potentially amplifying harmful content during critical electoral periods.

Electoral resilience

A MULTI-PRONGED strategy is essential to protect the credibility of Bangladesh’s elections in the era of artificial intelligence.

Electoral AI guidelines and protocols: The Election Commission, in partnership with civil society, academic institutions and international organisations, should develop clear guidelines on the ethical and legal use of artificial intelligence in political campaigns. The guidelines should define what constitutes artificial intelligence-generated content, what kinds of political manipulation are prohibited and what penalties apply for violations.

Such protocols should be published in advance of elections, with input from all political parties and digital platforms, to ensure fairness and clarity.

Real-time monitoring: A dedicated electoral technology monitoring cell should be established within the Election Commission to track, flag and investigate artificial intelligence-generated disinformation, deepfakes and coordinated bot activity. This unit should be staffed with data scientists, digital security experts and communication specialists who can work in real time during election cycles. Collaboration with fact-checking organisations and social media watchdogs will be essential to validate reports and intervene quickly.

Platform accountability and transparency: The government should mandate platform accountability and transparency to ensure a fair electoral process in the age of artificial intelligence. Major digital platforms must be required to publicly disclose political advertisements run on their platforms, including detailed information on funding sources and targeting criteria. Additionally, the platforms should report on the specific measures that they take to moderate harmful or manipulated artificial intelligence-generated content, particularly content that could influence voter perception or disrupt public trust. Regular transparency reports outlining the nature and scale of artificial intelligence-driven disinformation trends should also be published. Bangladesh could explore regulatory frameworks similar to the European Union’s Digital Services Act, which obliges tech platforms to uphold rigorous transparency standards, especially during election periods. Such measures would help establish a system of accountability and serve as a deterrent to the misuse of artificial intelligence in the electoral landscape.

Awareness and digital literacy: The best defence against artificial intelligence manipulation is an informed electorate. Nationwide campaigns in schools, colleges and community centres and on local media should educate citizens on how to recognise fake news, deepfakes and artificial intelligence-driven disinformation.

Media literacy should be integrated into secondary school curriculums, and youth organisations should be trained to serve as ‘digital democracy ambassadors’ in their communities.

Journalists, civic technologists and whistleblowers play a vital role in exposing electoral manipulation. Laws and institutional mechanisms must be strengthened to protect them from intimidation, surveillance or legal harassment, especially when they report on artificial intelligence misuse in political campaigns.

Artificial intelligence is not just a threat. It can also be part of the solution. Civic tech organisations should be supported in developing artificial intelligence tools that enhance voter information, counter disinformation and improve citizen access to election-related services.

If designed transparently and used ethically, artificial intelligence chatbots could help voters verify their registration status, locate polling booths or report irregularities.

Resilient democratic future

THE issue facing Bangladesh is not whether artificial intelligence will impact elections. It already does. The issue is, rather, how we will respond to the changes. Will we allow powerful technologies to be exploited for short-term political gains or invest in the institutions, regulations and civic capacities needed to safeguard democracy in the digital age?

Bangladesh has made significant progress in expanding voter participation, digitising election processes and ensuring peaceful transfers of power in recent decades. The gains must not be reversed by the unchecked use of artificial intelligence and algorithmic manipulation. By taking a bold, coordinated and inclusive approach today, we can build an electoral system that is not only free and fair, but also future-ready.

Democracy may well depend on how we manage the intelligence, not just artificial, but human and collective, that we bring to the task.

Musharraf Tansen is a doctoral researcher at the University of Dhaka.