Votes in the Fog: How Misinformation Will Shape the Next General Election
Misinformation is less about persuasion and more about governing uncertainty.
Introduction: An Election That Begins Before the Ballot
The next general election will not begin on polling day. It has already begun: in forwarded messages, short videos, out-of-context images, anonymous social media profiles, and private chats where rumor spreads quickly. Networks of chatbots and coordinated social media operatives are already in position to steer the narrative.
In this environment, elections are no longer shaped only by manifestos, candidates, or rallies. They are shaped by uncertainty: people are unsure what is true, whom they can trust, and whether political participation itself is meaningful or risky. In digital life, misinformation does not need to persuade everyone. It only needs to confuse, exhaust, and polarize.
Beyond Lies: Understanding the Information Disorder
Public discussion often collapses everything into one phrase: fake news.
But the digital ecology surrounding elections is more complex.
Misinformation is false or misleading information shared without intent to deceive, such as “just in case” rumors about voting.
Disinformation is false information deliberately created to deceive, such as fabricated stories, manipulated images, and narratives crafted to make the election seem unfair.
Malinformation consists of information that may be true but is used maliciously—old videos resurfaced, private messages leaked, or selective facts presented to damage credibility.
Of these, malinformation is often the most powerful. Because it is anchored in reality, it is harder to dismiss and easier to weaponize.
Close observation of pre-election social media suggests that misinformation does not spread as isolated false statements. Instead, emotionally saturated narratives—grief, anger, moral outrage, national pride—consistently attract higher engagement than policy-oriented content. The pattern holds across political camps, suggesting that emotions, not just facts, increasingly anchor political trust online.
A Different Way of Looking at Misinformation
Much existing research approaches misinformation as a problem of media effects: How does false content spread? Who believes it? Does it change voting behavior?
These questions matter, but they are incomplete. So are the usual remedies: media literacy alone is insufficient.
Such approaches assume a public that can debate openly, trust institutions, and take part in politics without fear of consequences. In many societies, including Bangladesh, that assumption no longer holds. Years of surveillance and self-censorship have left their mark, as has the newfound freedom that followed the July 2024 uprising, even as the overall security situation continues to deteriorate.
An anthropological perspective begins elsewhere: with how people experience digital political life.
From this viewpoint, misinformation is not simply a failure of facts. It is a political condition—one that reshapes trust, fear, moral judgment, and decisions about visibility. People do not merely consume political content; they calculate risk, anticipate backlash, and learn when silence may be safer than speech.
Seen this way, misinformation is less about persuasion and more about governing uncertainty.
How Misinformation Is Likely to Shape the Election
Eroding Trust Without Offering Alternatives
Much election-related misinformation does not instruct people what to believe. Instead, it teaches them that nothing is reliable—not institutions, not procedures, not even opposition claims.
This erosion of trust is subtle. It produces cynicism rather than outrage, withdrawal rather than protest. Voters may still participate, but without conviction. The election proceeds, yet legitimacy quietly thins.
Moral Polarization Over Policy Debate
Digital misinformation increasingly operates through moral framing rather than factual argument:
Who is righteous?
Who is corrupt?
Who threatens religion, culture, or the nation?
In this moral economy, evidence matters less than alignment. Content spreads not because it is accurate, but because it feels morally correct.
Posts built on vivid emotion and moral certainty are shared far more often than those about legal questions, policy, or official procedure. Rational discourse may earn respect, but emotional discourse earns circulation. During the July uprising and since, social media influencers have been observed circulating selective “facts” in support of particular narratives, allowing them to steer large audiences and, at times, incite them to violence.
Malinformation and the Politics of Moral Canonization
Recent patterns of online political communication reveal how actual events—political violence, public mourning, national commemorations—quickly transform into moral symbols.
Once canonized digitally, such events become resistant to contextual correction. Competing actors mobilize the same incident to assert legitimacy, assign blame, or demand loyalty. In these moments, misinformation does not depend on fabrication, but on selective amplification of truth.
This is where malinformation becomes most potent: it weaponizes reality itself.
Platform-Specific Amplification
Different platforms intensify different dynamics:
Facebook circulates partisan narratives and symbolic legitimacy through reactions and shares.
YouTube hosts long-form “analysis” that mimics authority and expertise.
Short-video platforms compress politics into emotion and spectacle.
Messaging apps transform rumor into trusted knowledge within closed networks.
Once political content enters private channels, correction becomes nearly impossible.
The Power of Images Over Arguments
Visuals increasingly do political work that words no longer need to do. Crowd shots, coffins, crying parents, drone footage of rallies, stylized posters—these images operate as proof without explanation. In a high-tension election, atmosphere becomes evidence. Emotional plausibility replaces verification.
The Weaponization of Silence
Not all manipulation is loud. Ethnographic observation also reveals a growing pattern of political withdrawal. Many users continue to consume political content but avoid commenting, sharing, or expressing disagreement. This silence is not apathy; it reflects fear of backlash, social labeling, and moral exposure. What remains unseen—stories not amplified, voices algorithmically buried, topics people avoid—also shapes electoral imagination. Silence itself becomes politically meaningful.
Why Fact-Checking Will Not Be Enough
Fact-checking assumes misinformation is primarily an informational problem. But people rarely share misleading content because they lack facts. They share because it fits their fears, loyalties, and moral intuitions.
Observed pre-election activity shows that measured, policy-based communication consistently travels more slowly than emotionally charged narratives. This asymmetry helps explain why fact-checking initiatives struggle to interrupt misinformation cycles: they operate in an environment where speed, emotion, and moral alignment outweigh verification.
This is not simply an information crisis. It is a crisis of political belonging.
What Is Ultimately at Stake
The danger is not only distorted electoral outcomes.
It is the normalization of a political environment where:
truth feels optional
participation feels risky
cynicism feels rational
In such conditions, democracy survives procedurally while hollowing out substantively.
Conclusion: Elections in the Age of Uncertainty
If earlier elections were about turnout and ballots, the coming one may be about emotional governance and perception management. The most effective manipulation will not tell people what to think—but will teach them when not to speak, when not to trust, and when to withdraw.
When people stop believing that truth matters, falsehood no longer needs to hide.
Author’s note:
This essay draws on insights from ongoing digital ethnographic observation of pre-election social media activity in Bangladesh conducted in a teaching–research setting. Specific examples have been anonymized and generalized.
Dr. Moiyen Zalal Chowdhury