Authoritarian regimes have long relied on a dual strategy: glorifying nationalism, patriotism, and cultural identity on one hand, while simultaneously nurturing hatred, division, and fear on the other. In the digital age, this duality has found a powerful new arena – cyberspace. The modern manifestation of this phenomenon is what we now call the ‘troll army.’
Across the world, troll armies have emerged as one of the most effective instruments of political propaganda. Their objective is no longer limited to silencing dissent; rather, they seek to manufacture a climate of collective hostility where rational discourse is drowned out by orchestrated outrage. Social media platforms have become their primary battlegrounds – spaces where truth is buried under the weight of noise, and public opinion is shaped less by facts than by repetition, emotion, and manipulation.
China presents perhaps the most sophisticated example of this phenomenon. Over the past two decades, the Chinese state has built an extensive system of digital governance that tightly regulates online expression while simultaneously mobilising pro-government voices. At the centre of this ecosystem lies what is popularly known as the ‘50 Cent Party’ (Wumao Dang) – a loosely organised but highly effective network of commentators believed to number in the millions. Their role is not merely to counter criticism, but to actively steer conversations in favour of the state, amplify nationalist narratives, and discredit dissenting voices.

Contrary to popular belief, Chinese digital censorship is not solely about deleting content. It is equally about distraction and redirection. Research has shown that a significant portion of pro-government online activity avoids direct confrontation with critics; instead, it floods platforms with positive or irrelevant content to dilute critical discussions. This strategy creates an illusion of consensus while marginalising opposing views.
The architecture of China’s internet further reinforces this system. The ‘Great Firewall’ effectively isolates domestic users from global platforms such as Facebook, Twitter (now X), and YouTube, replacing them with tightly controlled alternatives like WeChat, Weibo, and Douyin. These platforms operate under strict regulatory frameworks where content moderation is both automated and human-driven, often guided by opaque state directives. Sensitive topics – ranging from Xinjiang and Tibet to Hong Kong and Tiananmen – are heavily censored, and users who cross red lines risk account suspension, surveillance, or worse. In this environment, freedom of expression exists within clearly defined – and constantly shifting – boundaries. Citizens are not entirely silent; rather, they engage in a form of ‘coded speech,’ using metaphors, satire, and linguistic creativity to navigate censorship. However, such expressions are fragile and often short-lived, as authorities continuously adapt their monitoring mechanisms.
The role of troll-like behaviour becomes particularly visible in discussions surrounding sensitive issues, such as the treatment of Uyghur Muslims. Online narratives are swiftly policed, with coordinated responses that frame criticism as foreign interference or misinformation. This blending of state policy, digital control, and mass participation creates a powerful ecosystem where dissent is not only suppressed but socially delegitimised. The implications of such a system extend far beyond China. As governments and political actors worldwide observe and, in some cases, emulate aspects of this model, the line between organic public opinion and manufactured consensus becomes increasingly blurred. The rise of algorithm-driven engagement further exacerbates this trend, as outrage and polarisation are often rewarded with visibility.
For emerging democracies like Bangladesh, these developments offer both a warning and a mirror. While the political and technological contexts differ, the growing use of coordinated online harassment, disinformation, and partisan digital mobilisation is becoming increasingly evident. The temptation to weaponise nationalism and exploit digital platforms for short-term political gain can have long-term consequences for democratic institutions and social cohesion. The challenge, therefore, is not merely technological but ethical and political. Strengthening digital literacy, ensuring platform accountability, and fostering a culture of respectful disagreement are essential steps. Equally important is the commitment of political actors to refrain from exploiting digital spaces for division and manipulation.
Borrowing from the poetic sentiment, “Know your companion on the road (bondhu tomar pother shathike chine-e nio)”, we might say that in today’s digital world, recognising who shapes our online realities has never been more urgent. The companions we encounter in cyberspace are not always who they seem. And in the noise of orchestrated voices, the task of identifying truth becomes both more difficult – and more necessary – than ever.
The writer is an Associate Professor, Communication and Journalism, University of Chittagong
