Data Protection: An Ethical Bricolage for the European Union?

On October 24, Giovanni Buttarelli, European Data Protection Supervisor (EDPS), gave the opening address of the Public Session of the 40th International Conference of Data Protection and Privacy Commissioners in Brussels.  Because the address ranged widely across many aspects of data protection, people and companies affected by the General Data Protection Regulation (GDPR) would do well to understand his thinking about the current and possible future states of data protection regulation in the European Union (EU).

In keeping with the theme of the session, “Debating Ethics,” Buttarelli stated at the outset that the conference “is not a privacy or data protection conference,” but rather “a conference about the human values which underpin privacy and data protection.”  He asserted “that we are now living through a new generational shift in the respect for privacy, . . . towards establishing a sustainable ethics for a digitized society.” That shift, in his view, is driven “by the globalization of the economy,” “socio-technological forces,” “digitization of almost everything in the economy, politics, and government,” and “[a]bove all . . . by the prospect of human decision making, responsibility and accountability being delegated to machines.”

Buttarelli then stated that “[i]n today’s digital sphere, there is no . . . ethical consensus” – either in Europe or at a global level – but that “we urgently need one” because “digital technologies and data flows are already intensely global.”  “To cultivate a sustainable digital ethics,” he maintained,

we need to look, objectively, at how those technologies have affected people in good ways and bad[.]  We need a critical understanding of the ethics informing decisions by companies, governments and regulators whenever they develop and deploy new technologies. Technology is still, for now, predominantly designed and deployed by humans, for purposes defined by humans.  But we are fast approaching a period where design, deployment and control of new technologies and technological processes are delegated to machines.

As illustrations of that approaching delegation of technology to machines, Buttarelli cited five “case studies”:

  1. “[K]iller drones.”
  2. “[A]lgorithmic decision-making in criminal sentencing,” which “submits individuals to life-changing decisions based on opaque criteria with little or no due process.”
  3. “[T]he role of social media whose unaccountable algorithmic decision-making has been weaponized by bad actors in ethnic conflict zones, with at times appalling human consequences, notably in Myanmar.”
  4. Blockchain technology, which “if its current rate of growth continues, . . . will generate as much carbon emissions worldwide as the whole of the United States.”
  5. “[R]ights for robots,” including consideration of “the ‘robotised humans’ of today.”

Buttarelli stated that these and other practices “call into question basic notions of human dignity.” In his view, “[t]hose responsible for these phenomena may be well-intentioned,” but “their ethics are deeply questionable.”  He maintained that

we are witnessing a state of cognitive dissonance on a global scale.  We need to ask whether our moral compass [has] been suspended in the drive for scale and innovation. At this tipping point for our digital society, it is time to develop a clear and sustainable moral code.

While Buttarelli did not define what such a moral code would look like, he noted that “the European legislator [sic] did not think about ethics when it drafted the GDPR.”  He immediately added that his statement

is not a criticism of the GDPR.  It is a reality check on the limitations of any law, even a comprehensive one. Laws establish the minimum standard.  Best practices are assumed to go beyond the minimum standard. So for me, compliance with the law is not enough. . . . From my perspective, ethics come before, during and after the law.  It informs how laws are drafted, interpreted and revised.  It fills the gaps where the law appears to be silent.  Ethics is the basis for challenging laws.

Finally, Buttarelli explained why data protection authorities should be involved in the debate over a moral code for information.  He informed the Conference that “[a]ccording to the results of our consultation, 86% of respondents believe authorities should play a role in the governance of digital ethics.”  He briefly commended privacy professionals for developing a range of compliance tools, but stated that “self-regulation alone is not the solution. For us as data protection authorities, I believe that ethics is among our most pressing strategic challenges. We have to be able to understand technology, and to articulate a coherent ethical framework.”  He added that “the more personal data processing affects the collective interest, the less we can look to the GDPR for answers.  Perhaps ethics will fill that void.”

* * *

For those who pay attention to transnational data-protection issues, Buttarelli’s focus on this topic should come as no surprise.  Since the publication of the EDPS Strategy 2015-2019, one of the EDPS’s stated priorities has been “to assess an ethical dimension beyond the application of data protection rules; we want to encourage a better informed conversation on what big data and the internet of things will mean for our digital rights.”  In his Opinion of September 11, 2015, Buttarelli went further, urging that the concept of human dignity – which Article 1 of the European Union’s Charter of Fundamental Rights proclaims “is inviolable” and “must be respected and protected” – “should be at the heart of a new digital ethics.”  To help define that “new digital ethics,” he promised to establish an EU data protection ethics board, a promise he has since fulfilled.

Because Buttarelli’s address was so discursive, it is difficult to determine the extent to which he intended certain remarks merely to provoke discussion and exchanges of ideas, and others to promote a potential regulatory agenda for EU data protection regulators.  Other recent comments by Buttarelli indicate that, at the least, he is purposefully laying the groundwork for the latter.

In an October 3 interview with TechCrunch, Buttarelli addressed what TechCrunch termed “privacy hostile business models,” and expressed doubt that the GDPR alone “will be remedy enough to fix all privacy hostile business models.”  To that end, TechCrunch commented, he “is actively pushing to accelerate innovation and debate around data ethics — to support efforts to steer markets and business models in, well, a more humanitarian direction.”  Not surprisingly, Sir Tim Berners-Lee, who “has been increasingly strident in his criticism of how commercial interests have come to dominate the Internet by exploiting people’s personal data,” was a key speaker at the Brussels Conference.  In addition, Buttarelli stated that “the legal framework” can help address “a lot of inequality in the tech world,” but that it “will not give all the relevant answers to identify what is legally and technically feasible but morally untenable.”

Buttarelli also stated in the TechCrunch interview that in May 2019, on the anniversary of the GDPR’s coming into force, “he will publish a manifesto for a next-generation framework that envisages active collaboration between Europe’s privacy overseers and antitrust regulators.”  Because it is unlikely that a comprehensive EU-wide ethical or moral code can be drafted and approved by that time, that privacy-antitrust framework manifesto will almost inevitably be infused with moral and ethical requirements of Buttarelli’s and EDPS officials’ own choosing, rather than requirements developed through extensive public consultation and a gradual building of consensus.

Such a course of action should be deeply concerning for the commercial sector as well as the general public.  Although law can have a moral force, U.S. Supreme Court Justice Oliver Wendell Holmes, Jr., wrote that there also can be “a confusion between morality and law, which sometimes rises to the height of conscious theory, and more often and indeed constantly is making trouble in detail without reaching the point of consciousness.”

Buttarelli’s call for “a clear and sustainable moral code” and professed desire for “a coherent ethical framework” may sound laudable at first.  But it is not at all clear that there can be such a thing as a single “coherent ethical framework” for information in the digital age.  Each of the topics that Buttarelli briefly mentioned – blockchain, drone technology, robotic “rights,” and the use of algorithms in sentencing and social media – certainly involves ethical and moral issues that deserve thoughtful discussion.  It is highly improbable, however, that a single ethical framework exists, or can be devised, that can be applied to such widely varied issues in a coherent and consistent manner.

Since that framework does not now exist, as Buttarelli himself would have to concede, the true significance of his address is that it offers an ethical bricolage: a loose assortment of ideas, notions, and sentiments about ethics and morality from which data-protection authorities can pick and choose freely as they seek to justify further expansion of their authority.  For that reason, interested observers – which should include anyone who could be affected by future expansion of EDPS-driven regulation – should closely watch the EDPS’s actions in the next several months and be prepared to challenge future regulatory proposals that lack ethical and policy coherence.  Justice Holmes’s warning about “those who think it advantageous to get as much ethics into the law as they can” applies just as much to regulators as it does to legislators.
