Singapore Policy Journal

Looking Beyond POFMA to Combat Fake News and Misinformation in Singapore

In the past two decades, online communication and social media platforms have become dominant in our lives, serving as tools to advance globalisation, connectivity and the freedom of speech. However, falsehoods have also proliferated, polluting our interconnected information ecosystem. Such falsehoods—whether maliciously spread or not—have served vested interests, leading to potential personal harm to individuals, polarised communities, and diminished trust towards experts, institutions, and technology.

Singapore is among a number of countries that have chosen punitive measures to mitigate online falsehoods, particularly through the introduction of the Protection from Online Falsehoods and Manipulation Act (POFMA). While POFMA can be a useful deterrent, it may not be effective on its own against the spread of falsehoods. This essay contends that there are non-punitive alternatives that the Singapore government should consider adopting, such as the promotion of media and information literacy as well as experimental approaches such as gamification and “prebunking.”

POFMA and hard regulations

In heterogeneous, multicultural Singapore, small incidents stemming from misinformation can light a fire that may erode trust in public institutions and tear apart the fabric of society. Hence, the Singapore Parliament set up a Select Committee on Deliberate Online Falsehoods in 2018 to assess this complex issue and recommend solutions.[i] The Committee recommended a multi-pronged approach to combat deliberate online falsehoods, focused on: (i) nurturing an informed public; (ii) reinforcing social cohesion and trust; (iii) promoting fact-checking; (iv) disrupting online falsehoods, with particular consideration to the role of technology companies; and (v) dealing with national security and sovereignty threats. It highlighted the need for government intervention, through legislation, to “disrupt online falsehoods.”[ii]

This resulted in the May 2019 enactment of POFMA, which criminalises the deliberate spread of online falsehoods on online communication, social media, and private messaging platforms. Anyone caught contravening the law is subject to a fine not exceeding S$50,000, imprisonment of up to five years, or both.[iii] Presently, the government may issue a Correction Direction, which requires the accused to publish a corrective notice indicating that the published information is false, without necessarily removing access to the falsehood. It can also issue a Stop Communication Direction, under which access to the information is disabled and technology companies can be ordered to block accounts spreading the falsehood. Institutionally, the POFMA Office was established under the Infocomm Media Development Authority (IMDA) to administer the Act and provide support and technical advice to cabinet ministers.[iv]

Singapore is not alone in choosing the path of deterrence to combat falsehoods. A number of countries, including Germany, France, and Thailand, have introduced laws that grant authorities more executive power to deter fake news, allowing them to force social media platforms, websites, and publishers to remove false content.[v] At the time of writing, fact-checking reporter Daniel Funke had found at least 50 countries that took action against online falsehoods, ranging from hard regulations, such as internet shutdowns, to soft regulations, such as media literacy initiatives and task forces.[vi] While he found the effectiveness of such actions hard to assess, he noted that critics saw hard actions as potentially censoring citizens and soft actions as insufficiently meaningful. Given the complex nature of falsehoods, governments globally now face an unenviable task: the question is how to regulate fake news, rather than whether to regulate it at all.

POFMA has had its fair share of compliments and criticisms, ranging from those who believe it has helped to keep falsehoods in check, reduce public panic, and protect the public interest, to those concerned that the risks of misinformation have continued to rise despite its enactment.[vii] There are also worries that POFMA might be abused to silence critics of the government. It is also not easy to measure the effectiveness of legislation like POFMA, where objectives such as combating misinformation can be intangible.

The COVID-19 pandemic has showcased both the strengths and limitations of POFMA. Dr Carol Soon, senior research fellow at the Institute of Policy Studies, noted that “instances of how POFMA was used during the COVID-19 outbreak demonstrates how it can be used to protect public interest, specifically, safeguarding public health and public safety.”[viii] However, fake news still looms large in Singapore’s information ecosystem: a study by the National Centre for Infectious Diseases found that six in ten Singaporeans had received COVID-19-related falsehoods on social media.[ix] An alarming survey at the end of 2020 by the NTU Wee Kim Wee School of Communication and Information found that almost one in four Singapore residents believed the false claim that COVID-19 vaccines alter DNA.[x] This raises concerns about whether deterrence is the best way forward in curbing the spread of online falsehoods.

Reviewing how information is consumed and disseminated

Before exploring alternatives to punitive measures in combating misinformation, one must first understand why people share falsehoods in the first place. Increasingly, experts observe that rational persuasion has diminished on online platforms, while “affective persuasion,” in which people receive and share messages for their symbolic and emotional value,[xi] has become pervasive,[xii] and collective action is driven by shared feelings and emotions.[xiii] Cognitive biases, affective polarisation, and ideological sorting are arguably more responsible for susceptibility to misinformation and misperceptions than the media and information environment that facilitates the sharing of, and exposure to, misinformation.[xiv]

The advent of such platforms has allowed for direct access between content creators and users, with the former tapping into emotions to engage the latter. This has remade the rules of the game for the digital society of the post-truth era, in which platforms amplify populism and stir emotions.

Alternative approaches to combating misinformation

With a better understanding of how one’s emotions and biases can fuel the spread of online falsehoods, there is a need to explore alternative approaches beyond POFMA to combat misinformation and maintain the reservoir of public trust in institutions. This section takes as its starting point the non-legislative prongs of the multi-pronged approach advocated by the Parliamentary Select Committee, and builds on them by examining how these approaches have materialised in Singapore and globally.

Although legislative action can be important in managing online falsehoods, POFMA targets statement makers but does not address user-level issues such as news literacy, cognitive biases, and emotions. For example, while 76 percent of persons aged 60 and above in Singapore use smartphones,[xv] a 2018 survey showed that about 40 percent of Singaporeans between 55 and 65 are unable to identify falsehoods.[xvi]

To address this, fact-checking tools such as those by Reuters and AFP (Agence France-Presse), as well as Black Dot Research’s COVIDWatch,[xvii] have been developed to debunk falsehoods as soon as they are discovered. Media literacy programmes have also been designed to raise awareness and educate the public at large about online falsehoods and the consumption and dissemination of information. Altogether, they form part of media and information literacy (MIL), now considered an essential life skill for critically assessing the accuracy, soundness, and sufficiency of information, and for navigating its consumption, production, discovery, evaluation, and sharing.[xviii] [xix]

More attention should be focused on MIL, as the knowledge and skills required to identify credible news sources, such as fact-checking, verifying, and correcting, are similar to those needed to identify and perhaps reject misinformation.[xx] Organisations such as the Google News Initiative and the International Fact-Checking Network are promoting such initiatives. MIL efforts have also been established in Singapore, in the form of educational resources and awareness campaigns such as “s.u.r.e. (Source, Understand, Research, and Evaluate)” and “Sure Anot.”[xxi] The Media Literacy Council, a group of public and private stakeholders focused on media literacy and cyber wellness, has also introduced a “Better Internet Campaign”[xxii] that has disseminated e-resources and launched initiatives, including youth-led community projects, to promote online safety, responsibility, and civility, as well as to develop the ability to discern online content.[xxiii]

While MIL efforts help build societal capacity for dealing with the information at hand, they are long-term in nature: the objectives they pursue and the learning they seek to embed among citizens through education and awareness programmes take time, which we may not be able to afford. Such efforts are further limited by the funding and capabilities of the non-governmental organisations that drive MIL initiatives, which lack the profit motives that would attract buy-in from private stakeholders. As such, further resources need to be allocated to scale up MIL initiatives and to experiment with different approaches such as gamification and “prebunking.”

Gamification, or the addition of game mechanics and features to non-game environments, would help make MIL more interactive and more easily understood by laypeople. Prebunking, or the idea of debunking misinformation before it is encountered, is grounded in inoculation theory: people’s resilience against online falsehoods can be built through pre-emptive exposure to weakened persuasive arguments.[xxiv] One study trialled this method with vaccine-related conspiracy theories, demonstrating that it is possible to inoculate against the harmful effects of such conspiracy theories, but that once established, they are resistant to correction.[xxv]

Further, studies show that gamification and prebunking can reduce susceptibility to misinformation across cultures,[xxvi] which is pertinent to a multicultural society such as Singapore. New methods are also being trialled, such as observational correction, whereby people change their attitudes after seeing another person being corrected on social media platforms.[xxvii] [xxviii] While the approaches described remain experimental, they have shown potential for rooting out misinformation through people’s hearts and minds, rather than by hardening their worldviews.

Experimental approaches have begun to take root in Singapore, such as an affordable, gamified fake news toolkit aimed at children aged 8 to 12.[xxix] These innovative steps against misinformation are crucial to diversifying the tools available to the Singaporean government and public, helping the average Singaporean not only debunk but also “prebunk” falsehoods, anticipate and prepare against false content, and promote cyber community wellbeing.

Building institutions and technology we can trust

There is no doubt that more information needs to be gathered to improve upon current efforts to combat online falsehoods. Hard regulations like POFMA remain the norm and an important tool, but policymakers should be mindful of their intended, and especially unintended, consequences, which need to be studied further. Alternative approaches have been explored before[xxx] and are starting to take shape in Singapore. Such approaches will need further evaluation, but they should also be afforded the ability, resources, and space to be tested, so as to diversify and enhance Singaporeans’ capacity to detect and identify falsehoods. This would better ensure that we rebuild the foundations and relationships of trust between people, institutions, and technology, rather than fracture them further in this post-truth era.


[i] Parliament of Singapore. (2018). Select Committee on Deliberate Online Falsehoods—Causes, Consequences and Countermeasures. https://www.parliament.gov.sg/sconlinefalsehoods.

[ii] Parliament of Singapore. (2018). Executive Summary: Report of the Select Committee on Deliberate Online Falsehoods. https://www.parliament.gov.sg/docs/default-source/Press-Releases/executive-summary—report-of-the-select-committee-on-deliberate-online-falsehoods.pdf.

[iii] Protection from Online Falsehoods and Manipulation Act 2019. https://sso.agc.gov.sg/Acts-Supp/18-.

[iv] Hussain, A. (2019). IMDA to set up POFMA office to administer fake news law: S Iswaran. https://sg.news.yahoo.com/imda-to-set-up-pofma-office-to-administer-fake-news-law-s-iswaran-114357552.html.

[v] A guide to anti-misinformation actions around the world. (n.d.). Poynter. https://www.poynter.org/ifcn/anti-misinformation-actions/.

[vi] Funke, D. (2021). Global responses to misinformation and populism. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media Disinformation and Populism (1st ed., pp. 449–458). Routledge. https://doi.org/10.4324/9781003004431-47.

[vii] Mahmud, A. H. (2020). IN FOCUS: Has POFMA been effective? A look at the fake news law, 1 year since it kicked in. Channel News Asia. https://www.channelnewsasia.com/news/singapore/singapore-pofma-fake-news-law-1-year-kicked-in-13163404.

[viii] Ibid.

[ix] Chew, H. M. (2020). 6 in 10 people in Singapore have received fake COVID-19 news, likely on social media: Survey. Channel News Asia. https://www.channelnewsasia.com/news/singapore/fake-covid-19-news-study-ncid-messaging-platforms-whatsapp-12756084.

[x] Ang, Q. (2020, December 25). Nearly 1 in 4 here believes false claim on vaccine: Poll. The Straits Times. https://www.straitstimes.com/singapore/nearly-1-in-4-here-believes-false-claim-on-vaccine-poll.

[xi] Arias-Maldonado, M. (2017). Rethinking Populism in the Digital Age: Social Networks, Political Affects and Post-Truth Democracies. Congreso AECPA, Santiago de Compostela, 20–22 September.

[xii] Flew, T., & Iosifidis, P. (2020). Populism, globalisation and social media. International Communication Gazette, 82(1), 7–25. https://doi.org/10.1177/1748048519880721.

[xiii] Sunstein, C. R. (2018). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press. https://doi.org/10.2307/j.ctv8xnhtd.

[xiv] Dunaway, J. (2021). Polarisation and misinformation. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media Disinformation and Populism (1st ed., pp. 131–141). Routledge. https://doi.org/10.4324/9781003004431-15.

[xv] Annual Survey on Infocomm Usage in Households and by Individuals for 2019. (2019). Infocomm Media Development Authority (IMDA). https://www.imda.gov.sg/-/media/Imda/Files/Infocomm-Media-Landscape/Research-and-Statistics/Survey-Report/2019-HH-Public-Report_09032020.pdf?la=en.

[xvi] IPSOS Public Affairs. (2018). Trust and Confidence in News Sources. IPSOS. https://www.ipsos.com/sites/default/files/ct/news/documents/2018-10/ipsos_report_fake_news_updated_3_oct_2018.pdf.

[xvii] #COVIDWatch. (n.d.). Black Dot Research. https://blackdotresearch.sg/covid-watch/.

[xviii] Britt, M. A., Rouet, J.-F., Blaum, D., & Millis, K. (2019). A reasoned approach to dealing with fake news. Policy Insights from the Behavioral and Brain Sciences, 6(1), 94–101. https://doi.org/10.1177/2372732218814855.

[xix] Ireton, C., Posetti, J., & UNESCO. (2018). Journalism, ‘fake news’ & disinformation: Handbook for journalism education and training. http://unesdoc.unesco.org/images/0026/002655/265552E.pdf.

[xx] Tully, M. (2021). News literacy and misinformation. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media Disinformation and Populism (1st ed., pp. 480–488). Routledge. https://doi.org/10.4324/9781003004431-50.

[xxi] How seniors can fight fake news (English). (n.d.). https://sure.nlb.gov.sg/blog/seniors/SN0017.

[xxii] MLC | Better Internet Campaign 2020. (2020). Media Literacy Council. https://www.betterinternet.sg/Campaign-2020.

[xxiii] MLC | Youth-led community projects. (2020). Media Literacy Council. https://www.betterinternet.sg/Campaign-2020/Youth-led-community-projects/Youth-led-Community-Projects.

[xxiv] Roozenbeek, J., Basol, M., & van der Linden, S. (2021, February 22). A new way to inoculate people against misinformation. Behavioral Scientist. https://behavioralscientist.org/a-new-way-to-inoculate-people-against-misinformation/.

[xxv] Jolley, D., & Douglas, K. M. (2017). Prevention is better than cure: Addressing anti-vaccine conspiracy theories. Journal of Applied Social Psychology, 47(8), 459–469. https://doi.org/10.1111/jasp.12453.

[xxvi] Roozenbeek, J., van der Linden, S., & Nygren, T. (2020). Prebunking interventions based on “inoculation” theory can reduce susceptibility to misinformation across cultures. Harvard Kennedy School Misinformation Review, 1(2). https://doi.org/10.37016//mr-2020-008.

[xxvii] Bode, L., & Vraga, E. K. (2021). People-powered correction. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media Disinformation and Populism (1st ed., pp. 498–506). Routledge. https://doi.org/10.4324/9781003004431-52.

[xxviii] Bode, L., & Vraga, E. K. (2021). Correction experiences on social media during covid-19. Social Media + Society, 7(2), 205630512110088. https://doi.org/10.1177/20563051211008829.

[xxix] Fake News Toolkit. (n.d.). Eyeyah! https://eyeyah.com/product/fake-news-toolkit/.

[xxx] Soon, C., & Goh, S. (2018). Fake News, False Information and More: Countering Human Biases. IPS Working Papers No. 31. Institute of Policy Studies.