
IN THE autumn of 1954, a contentious football match between Princeton and Dartmouth sparked a revelation about human perception. The unusually physical game led to accusations from both sides. A week later, psychologists Albert Hastorf and Hadley Cantril showed recordings of the game to students from both schools. Princeton students saw Dartmouth players committing nearly all infractions, while Dartmouth students perceived a more balanced game; some even believed Princeton was the primary aggressor. The same footage, the same reality, but completely different perceptions.
This was not merely team loyalty at work. It revealed something fundamental about being human: we are biased, and we do not see the world as it is but as we are. Cognitive bias is the tendency to act irrationally because of our limited ability to process information objectively. It is not always negative, but it can cloud our judgement.
Have you ever looked for something online and then seen adverts for it everywhere on your screen? Watched one political video, then noticed your feed spiral deeper into similar, often extreme, content? That is not chance. That is re-targeting. It is the work of invisible systems engineered to capture and shape attention. Social media algorithms create echo chambers, reinforcing beliefs and distorting the sense of reality. This process, algorithmic gaslighting, exploits cognitive biases, fuelling division, outrage and misinformation. As artificial intelligence advances, so does our vulnerability.
Unlike previous generations, who accessed news through human editors with professional standards, we now receive information curated by algorithms optimised for engagement, not truth. The result? Fragmented societies trapped in separate realities where facts are debatable.
If we do not realise how others influence our thinking, we may end up giving them control over our thoughts, whether they are big tech companies, political groups or smart AI systems. Cognitive bias is not new. But in the digital age, it is no longer just a mental flaw. It is a weapon. And recognising it has become essential to preserving free thought and an honest society. Understanding bias today is no longer a philosophical exercise or a matter of mere knowledge; it is existential.
Predictably irrational minds
EVERY day, the brain makes about 35,000 decisions. The vast majority happen automatically, beneath conscious awareness. This mental efficiency serves us well most of the time. Imagine how exhausting life would be if every small choice had to be deliberately thought through. But this automatic thinking comes at a price: systematic errors in judgement that scientists call cognitive biases.
The discovery of these mental blind spots challenged centuries of thinking about human rationality. Nobel prize-winning psychologist Daniel Kahneman and his research partner Amos Tversky demonstrated that thinking is not just occasionally flawed; it is predictably irrational in specific, measurable ways. ‘Human perception is not a direct window to objective reality,’ explains Kahneman. ‘It’s more like a sophisticated interpretation system that prioritises efficiency over accuracy.’
Since their landmark 1974 paper, researchers have identified more than 180 distinct cognitive biases — a troubling inventory of the ways minds routinely distort reality.
Confirmation bias: We seek what agrees with us and ignore what does not. A person supporting a political party in Dhaka will cherry-pick news that favours it, ignoring its corruption scandals. Social media turbocharges this, creating echo chambers where opposing views feel extreme and rare.
Availability bias: We fear what is vivid, not what is likely. People worry about dengue deaths splashed across the headlines but ignore the silent epidemic of diabetes. A fatal car accident in the news makes people scared to drive, yet eating fast food daily feels harmless.
Negativity bias: Nine praises, one insult — and you obsess over the insult. Whether in office evaluations or family dramas, negative moments stick more. News outlets know this. Adding a negative word to a headline can boost clicks by more than 60 per cent.
Fundamental attribution error: When someone cuts you in traffic, they are ‘a jerk.’ When you do it, it is because you were late. We judge others by character, but ourselves by context. This bias erodes empathy and polarises society.
Anchoring bias: A panjabi seller in Bashundhara starts with Tk 15,000; even if you buy at Tk 10,000, the anchor sticks, even if it is worth only Tk 5,000. The first number shapes what feels ‘reasonable.’
Status quo bias: Many families continue treating asthma with home-made smoke rather than inhalers. ‘This is how Dada used to do it.’ Comfort in the old makes us resist better, proven solutions, even when health is at stake.
Framing effect: ‘Eighty per cent of people feel safe’ in an advertisement sounds comforting. But ‘20 per cent feel unsafe’ raises alarm and discomfort, although both say the same thing. The same statistic, different spins: the way information is framed shapes our reaction more than the information itself.
Dunning-Kruger effect: A village healer confidently treats heart diseases after reading one herbal remedy book. In reality, confidence does not equal competence. This bias lets the least informed speak the loudest.
Halo effect: A cricket legend or film star becomes a member of parliament. Why? Because people assume charisma in sports equals skill in policy-making. One strength blinds us to other weaknesses, which is dangerous in leadership choices.
Optimism bias: Despite storm alerts, some fishers still set sail in Cox’s Bazar. ‘Others may get caught, but I’ll return safely.’ This false optimism leads to disasters that were entirely preventable.
Sunk cost fallacy: You are 45 minutes into a terrible movie at Cineplex. You stay, hoping it will get better because you paid for the ticket. We irrationally continue investments such as money, time and effort even when cutting losses is wiser.
Self-serving bias: A student passes an examination and says, ‘I studied hard.’ If he fails the next one, ‘The teacher is unfair.’ We credit ourselves for success, but blame others or bad luck for failure. This protects ego but blocks growth.
Bias in personal and daily life
COGNITIVE biases most profoundly affect our intimate relationships, where misperception carries deep emotional consequences. ‘Cognitive bias functions as an invisible wedge between hearts,’ explains Dr John Gottman. ‘Partners create fixed narratives about each other and then selectively perceive behaviours that confirm these stories while remaining blind to contradictory evidence.’ This creates ‘negative sentiment override’, where even neutral or positive actions are interpreted negatively.
Issues such as confirmation bias and fundamental attribution error polarise communities, turning simple disagreements into character attacks and fuelling social divisions. Negativity bias and framing effect are exploited in campaigns; voters remember scandals more than solutions and identical facts are twisted to support opposing agendas. Anchoring, bandwagon and sunk cost fallacies shape spending; people chase trends, stick to overpriced choices or keep investing in poor purchases to justify them.
Artificial intelligence
AS ARTIFICIAL intelligence systems grow more sophisticated, they gain unprecedented ability to predict and influence human behaviour by exploiting cognitive biases. Unlike human manipulators, artificial intelligence can process billions of data points to identify exactly which psychological buttons to push.
‘We’re entering an era where machines can predict our choices better than we can, not because they’re conscious, but because they recognize patterns in our behaviour that remain invisible to us,’ explains Dr Michal Kosinski, computational psychologist at Stanford University.
Many popular apps use algorithms that learn exactly when and how to send notifications to trigger dopamine release and maximise user engagement. Streaming services precisely calculate a viewer’s preference ‘fingerprint’ to recommend content that keeps people watching longer. Mobile games carefully calibrate reward schedules based on known psychological principles to maximise addiction potential. In the west, political campaigns use psychographic targeting to tailor messages based on personality profiles, showing different versions of the same candidate to different voters.
As artificial intelligence systems become more integrated into daily life, their ability to exploit cognitive biases only grows more sophisticated. Without awareness and protection, we risk creating a future where our thoughts and behaviours are invisibly shaped by algorithms optimised for goals that may not align with our well-being. If not addressed, eventually the entire civilisation could be compromised.
Our hardwired tribal instincts once helped us to survive by promoting group cooperation. Today, these same instincts fuel unprecedented polarisation. Research from Yale and Stanford reveals that in many democracies, discrimination based on political affiliation now exceeds bias based on race, religion or ethnicity. In the west, people increasingly view political opponents as not only mistaken but also malevolent, a perspective that undermines the basis of democratic society.
Social media algorithms exploit this tribal psychology by promoting content that triggers outrage about opposing groups. This creates what researchers term ‘moral magnification’, where we perceive opponents’ views as more extreme than they actually are. The consequences can be deadly. During the Covid pandemic, researchers at Johns Hopkins University found that the politicisation of public health measures led to thousands of preventable deaths as simple precautions became symbols of tribal identity rather than medical decisions. A 2022 study in the Proceedings of the National Academy of Sciences showed how tribal affiliation turned simple public health measures into political battlegrounds, with attitudes towards preventive measures aligning more with group identity than with scientific understanding.
We cannot delete cognitive biases. They are part of being human. But in a world where these biases are weaponised by algorithms, awareness becomes our shield not only to protect ourselves, but to preserve a humane, rational and harmonious society.
The first step to overcoming cognitive bias is metacognitive awareness — thinking about how we think. This simple shift interrupts mental autopilot. Studies show that even brief exposure to bias education can noticeably improve decision-making. But awareness alone is not enough. The weakest version of an opposing idea should not be attacked; the strongest one should be engaged. This cultivates intellectual humility and dismantles confirmation bias. If something online enrages you, it is better to pause and ask whether it is true. That pause blocks the viral spread of manipulation. Credible sources across ideological lines should be followed. Bias thrives in echo chambers; cognitive diversity builds resilience. For emotionally charged decisions or posts, waiting for a day helps. Time cools impulsive thinking and lets reason surface.
Push notifications should be silenced. Feed blockers should be used to reclaim attention. Screen-free zones, with no phone during meals or before bed, should be scheduled. Addictive content should be unfollowed, and greyscale mode used to dull the screen’s dopamine pull. Cognitive bias education should begin early; it is a survival skill in the digital age, not just a psychology lesson. Platforms should be pushed to prioritise truth over clicks. Transparency and integrity, not profit, must shape technology. Thinking ‘I could be wrong’ should become a habit. Doubt is not weakness. It is an antidote to dogma. In an age of noise and manipulation, thinking about how we think is no longer optional; it is self-defence. If we do not take care of it now, we risk surrendering our minds to systems that profit from division, misinformation and emotional chaos. Without a collective commitment to bias awareness, we may soon find ourselves in a future where truth is irrelevant, critical thinking extinct and civilisation distorted beyond recognition.
Future of thinking
THE stakes could not be higher. As artificial intelligence systems grow more sophisticated in exploiting psychological vulnerabilities and as social fragmentation threatens democratic institutions, understanding cognitive bias has become essential for preserving both individual autonomy and social cohesion. Yet there is reason for hope. The fact that we can identify and develop countermeasures against our biases demonstrates our capacity for metacognition, our ability to think about our thinking. This uniquely human faculty allows us to recognise the limitations of our minds and implement systems to compensate for them.
The ancient parable of blind men examining an elephant comes to mind. In today’s fractured information landscape, we are all those blind men, touching only parts of reality while mistaking our partial perceptions for complete understanding. But unlike those men in the parable, we have a choice. We can acknowledge our blindness and work together towards greater collective understanding. Or we can continue insisting that our limited perspective represents the entire elephant.
The future of freedom, communities and even civilisation hangs in the balance. The path to liberation begins the moment we realise that our own perspectives may also be biased. Questioning ourselves, trying to see through the eyes of the other and challenging the ‘curated reality’ shaped by technology is the true form of freedom in our time.
AKM Shameem Akhter is a senior bureaucrat with an academic background in psychology and social psychology.