
Kennedy School Review


A European Perspective on the Protection of Personal Data in Cyberspace

BY NIKOLAS OTT AND HUGO ZYLBERBERG

Crafting practical yet effective regulation of digital data is an enormous policy challenge. In both the United States and Europe, businesses, privacy-advocacy groups, and governments have competing interests and are struggling to find a workable solution. Meanwhile, machines are tracking their users in an ever-increasing number of ways. All-day activity trackers are widely available, and the market for connected trackers is growing significantly. While most of these devices record such ordinary activities as the number of steps taken or the time spent biking, some are used in more adventurous ways, such as keeping track of sexual activity.[i] We argue that as the quantity and sensitivity of this data increase, users should have more information about the “ownership” of the data in question. Indeed, if users generate the data and companies provide the tools to capture and analyze it, who should own that data?

Market economies rely on the principle that once a good is created, its owner can trade it based on its monetary value. Following this logic, data that users help create should be, at least in some sense, their own. But how can users assess its value independently of the processing done by the companies that help them record it? The Boston Consulting Group predicts that digital identities will become a major source of revenue for private companies,[ii] while IBM, a technology and consulting corporation, believes that “data is valuable but most [users] don’t know how to harvest [their] own personal data or how to exchange it for something that really matters.”[iii] While it is true that the companies that help record this data provide free services in exchange, the rise of user-generated content prompts a discussion of how we understand the “our” in “our data.”

This data-for-service argument puts the user in the position of a customer and ignores the fact that, in this new data-driven economy, the user is responsible for much of the value creation. In that sense, to be fairly compensated, users should be considered not only customers but also producers. We first examine several ideas that rely on the notion of “users as producers” before outlining how the European General Data Protection Regulation implements this notion in cyberspace. More specifically, we believe that the European Union’s (EU) efforts reflect a feasible interpretation of data protection for the twenty-first century. Finally, in an age when the Internet of Things (in which ever more objects are connected to the Internet) increases the pervasiveness of data collection, and Big Data analytics make data processing ever more valuable, we argue that this data-protection approach favors user-centric models.


Data Exchanges Are Not One-Way Commercial Exchanges

Jaron Lanier, an American computer scientist and digital-media pioneer, was among the first to argue that the creation of data has value of its own. He contends that user-generated data that companies can convert into profits should be considered a kind of labor (some call this “digital labor”)[iv] and thus compensated (e.g., through “nanopayments”).[v] We first consider these ideas of nanopayments and digital labor to understand how exactly they challenge the common perspective on data possession.

Lanier argues for some kind of monetary compensation every time users share their data with a company.[vi] Why, indeed, should users not at least know the value they help generate, so that they can decide whether the benefit they receive is fair compensation for the use of their personal data? Since that data is valuable in itself, even a very small payment would help people make trade-offs: is the data more or less valuable than the amount of money being offered for it? Such compensation could also incentivize companies to be more frugal in the way they collect and process personal data, since they would have to pay to collect it. Unfortunately, this idea is excruciatingly difficult to operationalize.

Another notion that has been proposed is “digital labor.”[vii] The idea is that the user should be considered a worker for the company collecting his or her data and compensated accordingly. While this might appear radical, it is easy to see how the systems and online services we currently use could be understood in such terms. Instead of understanding Facebook as a company providing a free service, we could think of it as a digital factory that aggregates users’ data to sell to advertisers, paying its users by allowing them to use Facebook free of charge. In this model, the value that Facebook creates originates in the data that people create. The major difference, in this perspective, is that data exchanges are not merely commercial exchanges but labor exchanges.

There is no doubt that consumers receive some compensation for their data in the form of improved services and personalized features. Because Google knows more about users’ preferences, it can provide search results that are more accurate and ads that are more relevant. However, the term “compensation” can take two distinct meanings here. If we consider that users buy a service with data they own, we frame the exchange as a commercial relationship and see the user as a consumer. If we consider instead that users worked to create that data in the first place, we see the user as a producer who deserves protection from exploitation and expropriation. While the “user as consumer” model implies weighing the value of the data being given against the service being provided, the “user as producer” model considers data in another way: since users have spent time creating data and generating value for the data-collecting company, they deserve to be fairly compensated.


Leveling the Playing Field between Users as Producers and Companies through Data Protection

Fair compensation, however, is scarce. As a recent study by the Annenberg School for Communication shows, Americans do not feel that they are fairly compensated.[viii] The study concludes that “marketers are misrepresenting a large majority of Americans by claiming that Americans give out information about themselves as a trade-off for benefits they receive. To the contrary, the survey reveals most Americans do not believe that ‘data for discounts’ is a square deal.”

Strengthening users’ rights in their relationships with commercial providers might help build trust in the digital economy. Digital innovations are opening up opportunities to improve our lives, but they tend to rely on opaque data analytics and business models. In that sense, giving users more rights would mean demanding more transparency about the algorithms that process our data, the companies with which that data is shared or to which it is sold, and the nature of the data itself. There will be no meaningful consent as long as users do not understand how their data is used and lack the means to seek redress when they feel that such uses are incompatible with what they consented to. There is a trade-off between the protection of users’ data and the economic value created by the companies processing it. Under our current notion of consent, many of us give away our data the moment we decide to use modern technology at all, which makes protection hard to envisage.

“Surveillance is the business model of the Internet,” as security guru Bruce Schneier says (Source: Flickr, Finishing Schol)

Moreover, this trade-off between protection and economic value is becoming increasingly hard to make as the Internet of Things and Big Data analytics become commonplace. The Internet of Things is based on the idea that objects become more valuable when connected to the Internet; it brings connected trackers into every aspect of our lives, making the issue of protecting our data more palpable. Big Data refers to the ability to create value by processing large datasets in ways that could not be achieved with smaller ones; however, ex ante regulation of processed personal data is a challenge, since its value is unclear before the datasets are actually aggregated.

Combined, these two phenomena make it increasingly difficult for users to decide whether they should share a certain piece of data and with whom. To give users more information about the choice they are making, and to enable them to make this choice in the first place, the EU has been working on its own approach to personal data. After twenty years of regulatory development, that approach will be updated in spring 2016 with the General Data Protection Regulation.[ix]


The European Union’s Push toward a Modern Data Regulation Policy

The EU has long been a key player in the international field of data protection. Its first Data Protection Directive, adopted in 1995 “on the protection of individuals with regard to the processing of personal data and on the free movement of such data,”[x] stated that “data which are capable by their nature of infringing fundamental freedoms or privacy should not be processed unless the data subject gives his explicit consent.”[xi] In 2012, the European Commission introduced a reform plan that would “give citizens back control over their personal data, and (…) simplify the regulatory environment for business.”[xii] That plan became the new General Data Protection Regulation, which is built around several principles,[xiii] including:

1. a strengthening of existing rights and the creation of new ones

2. increased importance for national Data Protection Authorities, which become the “one-stop shop” for businesses

The first principle focuses on the rights of citizens in cyberspace. Two examples of such rights are the so-called right to be forgotten, which the EU has promoted since the European Court of Justice’s decision in May 2014,[xiv] and the right to data portability, which is intended to reduce the cost for users of switching between service providers. If users feel these rights are being violated, they can seek redress through their national Data Protection Authority.

Artists have been showcasing the physical underpinnings of the Internet, here old undersea cables (Photo Credit: Rob Koopman, Flickr)

These principles reflect the dual nature of data, which is treated both as an economic good (the regulation deals with many commercial matters) and as something that citizens have the right to protect. This dual nature makes it hard to answer our opening question about what the “our” in “our data” means. Instead of choosing one approach over the other, regulators should strive to let users express their preferences and have those preferences respected throughout the life cycle of their data. This is, in essence, what the European approach to data protection tries to do by putting users’ preferences at the center of the regulation.


The Economic and Moral Argument in Favor of User-Centric Systems

We need to redefine our understanding of the ownership of digital data. Such an understanding will better equip users to resolve the sharing trade-off (to share or not to share). Many technological projects aim to improve our understanding of the life cycle of data, and in some cases Data Protection Authorities or technology companies have developed tools that monitor basic data transactions between websites. But how can users know which of their data is collected and how it is used? This is fundamental information for deciding whether to share data with a third party. Researchers in the Computer Science Department at Columbia University are working on a way to reverse engineer data collection on websites to increase the transparency of the methods currently employed.[xv] Such a tool would provide users with more knowledge and therefore empower them as consumers. The argument for empowering consumers in the digital age follows the Anglo-Saxon notion that the market has the potential to solve privacy problems if power is evenly distributed between businesses and consumers.[xvi]
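To illustrate the intuition behind such reverse engineering, consider a differential experiment: create several otherwise identical profiles that differ only in which pieces of personal data they contain, record which ads each profile is shown, and flag ads that appear mostly when a given input is present. The following Python sketch is a toy illustration of that idea under our own assumptions; the inputs, the observe_ads stub, and the scoring threshold are hypothetical stand-ins, not the Columbia researchers’ actual tool.

from itertools import combinations

# Hypothetical personal inputs whose effect on ad targeting we want to audit.
INPUTS = ["diabetes", "hiking", "mortgage"]

def observe_ads(profile_inputs):
    # Mocked observation step: in a real audit, each shadow profile would be
    # a separate account, and we would record the ads the service shows it.
    ads = set()
    if "diabetes" in profile_inputs:
        ads.add("ad:glucose-monitor")
    if "mortgage" in profile_inputs:
        ads.add("ad:refinance-now")
    ads.add("ad:generic-shoes")  # shown to everyone, regardless of inputs
    return ads

# Build shadow profiles covering every subset of the inputs.
profiles = [set(c) for r in range(len(INPUTS) + 1)
            for c in combinations(INPUTS, r)]
observations = [(p, observe_ads(p)) for p in profiles]

# Score each (ad, input) pair: how much more often does the ad appear
# when the input is present than when it is absent?
all_ads = set().union(*(ads for _, ads in observations))
for ad in sorted(all_ads):
    for inp in INPUTS:
        with_inp = [ad in ads for p, ads in observations if inp in p]
        without = [ad in ads for p, ads in observations if inp not in p]
        lift = sum(with_inp) / len(with_inp) - sum(without) / len(without)
        if lift > 0.5:  # strong association suggests targeting on this input
            print(f"{ad} appears targeted at input '{inp}' (lift={lift:.2f})")

In a real audit, the observation step would involve live accounts and real ad logs, and the scoring would need to account for noise and confounding inputs, but the differential principle remains the same.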

Creating user-centric systems would also level the playing field between companies on the international scene. A user-centric view enables us to conceive of systems in which local providers can translate their geographical proximity and their knowledge of local cultural values into economic value, instead of normalizing consumers around a global average. This approach gives users the opportunity to decide how their personal data should be shared. When they produce valuable data, they can protect it and share it only when a transaction seems fair; when they produce data merely as a byproduct of a valuable service, they are free to give that data to private companies, which can then monetize it.

The growth of user-generated value in the context of increased connectedness (the Internet of Things) and value extraction from large datasets (Big Data) calls for a fundamental shift in the way we understand data exchanges. They are not merely commercial transactions in which users make an economic trade-off but rather labor exchanges in which citizens deserve protection for the labor they provide. In this perspective, the European data protection approach tries to provide an adequate level of data protection while preserving economic opportunities.

“The future is already here; it’s just not evenly distributed.” A quote from Neuromancer author William Gibson (Photo Credit: Joe Pemberton, Flickr)


Conclusion

We believe that the question of what data ownership means is becoming a central issue for Europe’s economic competitiveness. Indeed, cyberspace calls for a revision of many definitions that were elaborated for an offline world. The data protection approach developed by the EU addresses three main challenges of this endeavor. First, it creates new concepts relevant to cyberspace. Second, it leaves businesses an appropriate measure of freedom to innovate. Third, it provides users with a practical application of their human rights in cyberspace.

Beyond mere producers or consumers, users are first and foremost citizens. As such, these debates connect to fundamental aspects of democracy—they challenge the concept of identity and how we define ourselves in relation to others and to society as a whole. If our sense of identity shapes how we behave offline and online, it also seems that the way we perceive our actions in turn shapes our sense of identity. While the digital transition continues, this shaping of identities will be affected by the relationship between states, private actors, and citizens. As a democratic endeavor, data protection seeks to put citizens at the center of those relations to express their preferences and disagreements, as well as keep some measure of accountability over their institutions.

While not global in nature, this approach has strong extraterritorial consequences, since data moves across borders. Indeed, it can be framed as the EU imposing its view of data in cyberspace on the rest of the world. If the EU is committed to defending its data protection approach, it should embrace this extraterritorial nature. Data protection is a political endeavor, and the EU will continue to argue with its partners about how data flows should be organized in an interconnected world. Ultimately, the EU’s actions shape the development of norms on data sharing and the values and policies of personal data around the world.

Nikolas Ott is a Mercator Fellow focusing on the role of international legal norms and confidence-building measures in reducing and preventing the escalation of conflict in cyberspace. He studied international relations (MA) at The Fletcher School of Law and Diplomacy (Tufts University) and political science (BA) at the Freie Universität Berlin and the Pontificia Universidad Católica de Chile. Nikolas gained professional experience working at the Delegation of the European Union to the United States, the Dean’s Office at the Hertie School of Governance, and the headquarters of the German Federal Foreign Office.


Hugo Zylberberg is a Cyber Fellow at Columbia’s School of International and Public Affairs, in charge of coordinating the programs on Cybersecurity, Internet Governance and the Digital Economy, as well as a member of the Research Center Values and Policies of Personal Data at the Institut Mines-Telecom in Paris. He graduated with a Master in Public Policy from the Kennedy School of Government at Harvard and with a Master and a Bachelor of Science from the École polytechnique, and focuses on the geopolitics of cyberspace from a technical, economic and political perspective.


Cover Photo Credit: The data we exhaust on the Internet is of an increasingly personal nature, including fingerprints (Source: Flickr, CPOA).


[i] Neil Hughes, “Apple expands HealthKit in iOS 9 to track sexual activity, ovulation, UV exposure, water intake,” AppleInsider, 9 June 2015, http://appleinsider.com/articles/15/06/09/apples-expands-healthkit-in-ios-9-to-track-uv-exposure-water-intake-sexual-activity-ovulation.

[ii] Boston Consulting Group, “The Value of Our Digital Identity” (Boston: Boston Consulting Group, November 2012), http://www.libertyglobal.com/PDF/public-policy/The-Value-of-Our-Digital-Identity.pdf.

[iii] IBM, “The Data Society: Creating Value from Your Digital Information for Everyday Life,” 2013, http://download.intel.com/newsroom/kits/research/2013/pdfs/Data_Society_Backgrounder.pdf.

[iv] Trebor Scholz, ed., Digital Labor: The Internet as Playground and Factory (New York: Routledge, 2013), https://www.routledge.com/products/9780415896955.

[v] Jaron Lanier, Who Owns the Future?, Simon & Schuster trade paperback edition (New York: Simon & Schuster Paperback, 2014), http://www.jaronlanier.com/futurewebresources.html.

[vi] Ibid.

[vii] Scholz, Digital Labor.

[viii] Joseph Turow, Michael Hennessy, and Nora Draper, “The Tradeoff Fallacy: How Marketers Are Misrepresenting American Consumers and Opening Them Up to Exploitation,” Report Series at the Annenberg School for Communication, University of Pennsylvania, 2015, 3, https://www.asc.upenn.edu/sites/default/files/TradeoffFallacy_0.pdf.

[ix] European Commission, “The General Data Protection Regulation,” 2016, http://www.consilium.europa.eu/en/policies/data-protection-reform/data-protection-regulation/.

[x] European Commission, “Official Journal of the European Communities – Legislation,” Official Journal of the European Communities Volume 38, no. L 281 (23 November 1995), http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ:L:1995:281:TOC.

[xi] Ibid., 4, paragraph 33.

[xii] European Commission, “Reform of EU Data Protection Rules,” 2012, http://ec.europa.eu/justice/data-protection/reform/index_en.htm.

[xiii] European Commission, “Press Release: Commission Proposal on New Data Protection Rules to Boost EU Digital Single Market Supported by Justice Ministers” (European Commission, 15 June 2015), http://europa.eu/rapid/press-release_IP-15-5176_en.htm.

[xiv] Court of Justice of the European Union (Grand Chamber), Google Inc. v Agencia Española de Protección de Datos (AEPD), 2014, http://curia.europa.eu/juris/document/document_print.jsf?doclang=EN&docid=152065.

[xv] Holly Evarts, “New Tool Makes Online Personal Data More Transparent,” 18 August 2014, https://engineering.columbia.edu/new-tool-makes-online-personal-data-more-transparent.

[xvi] Christine A. Varney, “Consumer Privacy in the Information Age: A View from the United States” (The Privacy & American Business National Conference, Washington, DC, 9 October 1996), https://www.ftc.gov/es/public-statements/1996/10/consumer-privacy-information-age-view-united-states.