Singapore Policy Journal

Topic / Science, Technology and Data

[Reading Group] Collective Summary #3: Regulating Digital Technology — Challenges & Trade-offs

The following is the third of four collective summaries published by the Singapore Policy Journal’s reading group on Digital Technology. Each collective summary is a product of the topics discussed and the various research directions of the members of the reading group. The reading group comprises individuals from a range of backgrounds, bringing a multidisciplinary approach to digital technology.


The need to keep the COVID-19 pandemic under control has driven governments all around the world to strengthen their infectious disease surveillance capabilities by adopting new big-data-driven practices. Singapore is no exception, and the government’s rollout of SafeEntry and TraceTogether has introduced a new wave of discourse on data security, privacy, and accountability. These conversations are also being fueled by calls for stricter regulations on Big Tech as consumers come to grips with the industry’s concentration of power and its ability to influence human behavior.[i] In light of these developments, how should we think about the challenges and trade-offs that come with regulating digital technology?

The Singapore Policy Journal’s reading group invited Simon Chesterman and Roland Turner to share their insights on data protection, privacy, and information security. Simon Chesterman is the dean and Provost’s Chair Professor of the National University of Singapore’s Faculty of Law. He is a recognized authority on international law and has written extensively on global governance, the changing role of intelligence agencies, and the emerging role of artificial intelligence and big data. Roland Turner is a renowned privacy expert and is currently serving as the chief privacy officer for TrustSphere. He plays an active role in numerous technical communities in Singapore, being a HackerspaceSG founding member, FOSSASIA organizer, and executive committee member of the Internet Society’s Singapore Chapter.


“Our focus has shifted from privacy to data protection.”

The rollout of SafeEntry and TraceTogether during the pandemic has allowed the public to understand and experience the societal benefits of harnessing personal data on a population-wide scale. In this case, these technology initiatives allowed Singapore to contain the spread of COVID-19 and return to a relative state of economic and social normalcy. The benefits of collecting personal data in ensuring national security were also felt when the Internal Security Department identified and detained a Singaporean youth in January 2021 for planning terrorist attacks on two mosques,[ii] and another youth in March of the same year for planning an attack on a synagogue.[iii]

During his opening remarks, Chesterman noted that citizens do not want absolute privacy. They understand that the collection of personal data, to some extent, is necessary for the public good. Most citizens are not willing to cede such benefits for the sake of complete individual privacy. What they want, instead, is to be able to safeguard their data and have control over how their personal information is being used.

In response, the Singapore government has shifted to an accountability framework, recognizing that giving up personal data is almost inevitable in today’s world. Chesterman explains that by doing so, the onus is placed on the organization to protect the data by putting security measures in place and to explain to users how their data is being used, rather than on the user to consent and accept the risks incurred by sharing their personal data. The recent changes to the Personal Data Protection Act (PDPA) illustrate this trend away from consent towards a renewed emphasis on risk assessment and consumer protection. Notably, the revised PDPA expands the list of legitimate interests for which companies do not need to seek consent, or are deemed to have already obtained consent, to obtain and use personal data.[iv] However, while affording organizations more freedom to use personal data, the enhanced PDPA also increases pressure on organizations to strengthen organizational accountability and consumer protection by criminalizing the mishandling of personal data by individuals and increasing financial penalties for companies found guilty of a data breach.[v] Above all, this shift is a welcome one, given that the problem with consent today is that it is clearly artificial: there is nominally a choice to abstain from using these technologies, but in reality, as discussed by the reading group, there is a significant social cost to those who choose to opt out of them.


“Giving users control over their data is important in building buy-in and trust.”

Even though privacy has been de-emphasized, the fact that an individual’s data is being shared or made public does not remove their interest in having control over it. The contrast in public attitudes towards SafeEntry and TraceTogether is a testament to this.

Even though SafeEntry is arguably more intrusive than TraceTogether, citizens were more wary of the latter when it first rolled out. Turner reasons that this is because the user is given significantly more agency in SafeEntry. Specifically, users need to scan a QR code every time they enter a new location, allowing them to reassess their decision to share their location information. On the other hand, TraceTogether runs in the background, and users are not privy to the devices that TraceTogether is recognizing. They also do not have control over how TraceTogether logs their data beyond choosing to abstain from using the application or token entirely.

As we shift from focusing on privacy to data protection and control, more resources should be invested in understanding how we should view control and consent. For one, instead of viewing the achievement of privacy in all-or-nothing terms, the reading group found it more useful to reframe the concept of privacy as concentric circles, whereby each circle represents a level of personal data the individual is willing to share, depending on whom the data is being shared with. In this way, the individual is afforded greater control in choosing what data to give away, and what data to keep private.
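The concentric-circles framing above can be made concrete with a small illustration. The sketch below is a minimal, hypothetical model: the circle numbers, data fields, and recipient trust levels are invented for illustration and do not correspond to any actual framework or application.

```python
# A minimal sketch of the "concentric circles" privacy model.
# All circle assignments and trust levels here are illustrative assumptions.

# Lower circle number = more private (shared with fewer parties).
DATA_CIRCLES = {
    "health_records": 0,    # innermost: self and one's doctor only
    "precise_location": 1,  # a trusted contact-tracing authority
    "email_address": 2,     # services the user signs up for
    "display_name": 3,      # outermost: effectively public
}

# How far inward each recipient is trusted to reach.
RECIPIENT_TRUST = {
    "health_authority": 1,
    "retail_app": 2,
    "public_web": 3,
}

def may_share(field: str, recipient: str) -> bool:
    """A field may be shared only if the recipient is trusted to reach
    at least as far inward as the field's circle."""
    return RECIPIENT_TRUST[recipient] <= DATA_CIRCLES[field]

print(may_share("precise_location", "health_authority"))  # True
print(may_share("precise_location", "retail_app"))        # False
```

In this framing, granting or withholding consent is no longer all-or-nothing: the individual decides, field by field and recipient by recipient, how far inward each party may reach.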


“The point is to not pick a standard of liberty and claim it to be universal.”

Picking a standard of liberty and limiting the use of technologies that infringe the standard not only discourages innovation, but it also permanently closes the door on future opportunities to use these technologies for social good. Additionally, not all risks can be managed in this way.

Instead, Turner in his opening remarks recommended that policymakers conduct impact assessments that evaluate the benefits and risks of a technology they wish to introduce within their unique context. Impact assessments are crucial as they help policymakers engage in a real cost-benefit analysis that does not rely on vague concepts such as liberty or security. The reading group subsequently concurred with Turner during the discussion session, adding that the assessment of security risks, along with societal and ethical considerations, needs to be institutionalized. At the moment, government planning documents, such as the Digital Government Blueprint, do not explicitly mention a framework or a checklist for policymakers to identify areas of risk when designing digital government services. The Ethical OS Toolkit is an example of what such a framework could look like.[vi]

Publishing these impact assessments is also important, as it gives citizens true agency to make informed decisions with regard to the use of their personal data. Ultimately, a transparent approach allows more people to understand how their personal information is being used and protected, increasing the likelihood that they will trust the technology, and by extension, retain their trust in government.


“Regulating Big Tech poses a formidable challenge for Singapore.”

During the discussion session, the reading group also raised concerns over Singapore’s agency and influence vis-à-vis Big Tech and the challenges faced in regulating them. The ubiquitous nature of Big Tech gives them access to an “uncomfortably large amount of [our] information,” giving these companies the power to make decisions for people all around the world, regardless of their political systems and decision-making processes.

Though Singapore is able to change domestic laws and regulations, it risks pushing these Big Tech firms out of the country completely, to the overall detriment of citizens whose lives have become deeply intertwined with these services. Australia’s attempt at regulating Facebook and the company’s show of force in banning Australian news organizations from the platform is evidence of the formidable challenge governments face in reining in these tech giants.[vii]

Outside of domestic policy, the reading group emphasized that Singapore lacks the geopolitical and economic weight to shape digital standards on the international stage, even as other countries continue to look to Singapore for guidance on digital policy, regulation, and governance. Stricter regulations on Big Tech might also come into conflict with Singapore’s ambitions to become Asia’s Silicon Valley.[viii] As other countries begin the big breakup conversation with Big Tech, Singapore’s role in setting global norms and its stance on regulating Big Tech will require greater clarity.


“We need to continue closing the technology knowledge gap through policy.”

Another reason why consent has been made almost meaningless in the digital realm is that users do not have the technical knowledge to comprehend what consent exactly entails. As observed by the reading group, most users consent to the terms and conditions of service without reading or trying to understand them. This is, without a doubt, partly a function of their length, but it is also a function of the technical jargon embedded within these contracts.

Ultimately, without sufficient technical knowledge, the public is unable to fully assess the conditions, along with the trade-offs, that come when a new piece of technology or digital service is introduced. The publication of impact assessments, although a step towards greater transparency and accountability, would not help to inform or drive public discourse on these trade-offs if the majority of readers are unable to understand them. Independent experts can help to explain the technology and verify its security measures, but building trust in these third-party public technologists will be a long and arduous process. In the meantime, technical knowledge and digital literacy need to be continually nurtured in civil society to ensure that we are empowered to “watch the watchmen.”


[i] Lieber, Chavie. “Tech Companies Use ‘Persuasive Design’ to Get Us Hooked. Psychologists Say It’s Unethical.” Vox. Vox, August 8, 2018.

[ii] Lim, Min Zhang. “16-Year-Old Singaporean Detained under ISA for Planning Terrorist Attacks on Two Mosques.” The Straits Times. SPH Digital News, January 28, 2021.

[iii] “S’pore Youth ISA Detention: How a Former NSF Was Radicalised, Planned to Attack Jews.” The Straits Times. SPH Digital News, March 13, 2021.

[iv] Wong, Lester. “On Protecting Data While Enabling Innovation: 6 Highlights from MPs’ Rigorous Debate on PDPA Amendments.” The Straits Times. SPH Digital News, November 2, 2020.

[v] Wong, Lester. “Parliament: Heavier Fines for Data Breaches, More Support for Legitimate Business Uses of Data under Amended PDPA.” The Straits Times. SPH Digital News, November 2, 2020.

[vi] Ethical OS, August 8, 2018.

[vii] Paul, Kari. “What Facebook’s Australia News Ban Could Mean for Its Future in the US.” The Guardian. Guardian News and Media, February 27, 2021.

[viii] Shu, Catherine. “Singapore Is Poised to Become Asia’s Silicon Valley.” TechCrunch. TechCrunch, December 14, 2020.