Privacy and Data Protection
Abstract and Keywords
This chapter covers privacy and data protection. This entails a series of legal requirements for development and design, for the default settings, and for the employment of computer architectures. In addition, the chapter defines the right to privacy as a subjective right, attributed by objective law, which may be national (constitutional) law, international human rights law, or supranational law (EU fundamental rights law). The chapter first confronts the landscape of human rights law at the global, national, and EU level. It then inquires into the right to privacy, as guaranteed under the ECHR and the Charter of Fundamental Rights of the European Union (CFREU), and finally provides an extensive analysis of the new fundamental right to data protection, as guaranteed by the CFREU and protected by the General Data Protection Regulation (GDPR).
Working with computing systems, whether developing, integrating, or testing them, will often involve working with data. Sometimes this data will be personal data, and sometimes these systems will have a major impact on the private life of those targeted by these systems (think of data brokers, credit rating agencies), or those interacting with these systems (in the case of social networks, search engines). In this chapter, we will investigate the legal domain of privacy and data protection, which entails a series of legal requirements for the development and design, for the default settings, and for the employment of computer architectures. This chapter can in no way provide a comprehensive overview of privacy and data protection, which would require two separate books at the least. However, the purpose of this book is not to turn computer scientists into lawyers. The purpose is to provide some real taste and true bite of the law on legal topics that are highly relevant for computer science. Therefore, please check the references for further reading and, for real-world scenarios, consult a practising lawyer.
5.1 Human Rights Law
When tracing the history of human rights, we first encounter the English Bill of Rights of 1689, followed by the revolutionary French Déclaration des Droits de l’Homme et du Citoyen of 1789 and the US Bill of Rights of 1791. Though the famous Magna Charta of 1215 may seem an early example of a human rights charter, it did not attribute what we now call human rights. Instead, it ensured that the feudal lords were able to restrict the powers of the King, while protecting jurisdiction over their own subjects against royal interference. The era of the Magna Charta saw the struggle between a feudal society and an emergent royal power; this was not yet the era of a powerful modern state that managed to subject each and every person on its territory to its jurisdiction. The rights provided by the Magna Charta were mainly reserved for powerful lords, who wished to preserve the powers they had over their own land and their own serfs against the claims of the king.
5.1.1 Human rights as defence rights against the modern state
The rise of the modern state must be situated in the beginning of what historians call the era of ‘Modernity’, around the fifteenth and sixteenth centuries. It was the rise of the modern, bureaucratic state that warranted new types of protection against the monopolistic powers of the King and his clerks (feeding on the impressive affordances of proliferating printed text, see section 1.4). The rise of the idea of human rights coincides with the rise of sovereignty (see sections 1.4 and 4.1.2).
The human rights declarations of the seventeenth and eighteenth centuries provided those subject to the power of a sovereign state with an entitlement to civil and political rights, elevating their status to that of individual right bearers and constituents of the polity.
Some attribute the power of this attribution to the ‘endowment bias’; if people come to believe they ‘have’ these rights, they will invest in ‘keeping’ them. If the struggle this entails succeeds, these rights will eventually be instituted as effective subjective rights. In due course, respect for human dignity and a new emphasis on the centrality of the individual reconfigured the idea of law and politics, laying the groundwork for the more ‘practical and effective’ human rights protection of the second half of the twentieth century.
However, in the context of international law, human rights have been citizens’ rights rather than human rights, depending on constitutional protection and citizenship, thus offering little protection for subjects of rogue states. After the atrocities of the Second World War, states decided to elevate the protection of human rights to the level of international law, starting with the Universal Declaration of Human Rights of 1948. Though this declaration had no binding force, it was soon followed by various treaties at the global and regional level, aiming to finally institute human rights as enforceable subjective rights against the state.
5.1.2 From liberty rights to social, economic, and further rights
These legal goods are: privacy, non-discrimination, bodily integrity, freedom of movement, the presumption of innocence, a fair trial, freedom of expression, freedom of association, freedom of religion, and voting rights. Note that these legal goods are considered worthy of protection as public goods, because a society that does not protect them cannot support a viable democracy that depends on independence of thought and unhindered development of both individual and group identities. For that reason, they are also called civil and political rights. The focus is on public goods that protect individual persons as autonomous agents in a democratic polity and on negative obligations of the state towards its citizens.
These rights are often called social and economic rights. To actually provide these protected goods, a state cannot restrict itself to respecting liberty rights. The second-generation human rights impose positive obligations on states to create and sustain the goods it must protect. This implies that the second generation of human rights addresses states with ‘instruction norms’, rather than providing citizens with directly enforceable subjective rights. To exercise a right to employment, an economic system must be in place that enables such a right, meaning that second generation human rights require states to build institutions capable of supporting economic welfare and a fair distribution of access to social and economic goods.
Here, we encounter rights to construct and develop group identities and rights to a sustainable environment. These rights have even less of a straightforward relationship with individual entitlement, focusing on the rights of groups (e.g. the right to self-determination for indigenous peoples, which we already encountered in section 4.2.1, as a fundamental principle of international law) and obligations towards the natural environment on which human society depends (responsible innovation, sustainable development).
5.2 The Concept of Privacy
Before investigating the right to privacy as part of the first generation of human rights law, we will first inquire into the nature of privacy itself. The reason is that computer science has a specific relationship with privacy, notably in the context of digital security and cryptography. In that context, privacy is often seen as a subset of security, focused on hiding or removing the link between data and whoever the data refer to, or on encrypting the data to safeguard confidential data against eavesdropping. This has, as a consequence, meant that privacy protection is restricted to (1) anonymization or pseudonymization of personal data, by way of deleting or separating identifiers, and to (2) hiding the content by means of encryption or other security measures. The focus on hiding has generated research fields such as differential privacy and re-identification metrics, based on e.g. cryptography and key-management, k-anonymity, linkability metrics, and so on.
Though such research is of crucial importance to protect privacy, one must not mistake issues of identifiability and confidentiality for issues of privacy as the latter concerns far more than mere technical identifiability or readability.
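The notion of k-anonymity mentioned above can be illustrated with a short sketch. This is a minimal, purely illustrative implementation, assuming nothing beyond the standard definition (the function name, the sample records, and the choice of quasi-identifiers are invented for the example, not drawn from any standard library): a dataset is k-anonymous with respect to a set of quasi-identifiers if every record shares its quasi-identifier values with at least k−1 other records.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    occurs at least k times, i.e. each record is indistinguishable
    from at least k-1 other records on those attributes."""
    groups = Counter(
        tuple(record[attr] for attr in quasi_identifiers)
        for record in records
    )
    return all(count >= k for count in groups.values())

# Hypothetical, already-generalized records (age is bucketed, zip truncated).
records = [
    {"zip": "1071", "age": "30-39", "diagnosis": "flu"},
    {"zip": "1071", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "1072", "age": "40-49", "diagnosis": "flu"},
]

# The third record is unique on (zip, age), so 2-anonymity fails.
print(is_k_anonymous(records, ["zip", "age"], 2))  # prints False
```

Note that passing such a check concerns identifiability only; as the surrounding text stresses, it says nothing about the many other dimensions of privacy discussed in this chapter.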
Should we qualify this data as part of the privacy of the person the data refers to?
5.2.1 Taxonomies and family resemblance
Many authors have made attempts to define privacy by summing up the common denominators of what is generally seen as falling within the scope of privacy. This turns out to be a questionable undertaking, because the concept is as elusive as it is pertinent. Another way of tackling the issue of understanding privacy is to define it in terms of family resemblance.
Solove notes that some of these categories focus on goals, others on means, while they are in various ways interdependent. Taken separately, none of these definitions would exhaust the concept of privacy, being either too broad or too narrow. He warns that this is therefore not a taxonomy, which would assume mutually independent features of the same thing. On the contrary, the idea of a family resemblance means that privacy cannot be defined in terms of necessary and sufficient conditions, because there is no common core to the different conceptions of privacy. Instead, Wittgenstein’s notion of family resemblances enables us to take a pragmatic approach, recognizing the contextual, historical, dynamic nature of privacy, such as relating to family life, the body, or the home. This approach is bottom-up rather than abstract and acknowledges that, in the end, privacy is best seen as a set of practices rather than a formula. The concept of family resemblance was introduced as a way to understand the meaning of words by Wittgenstein in his Philosophical Investigations. The concept is very interesting for computer science as it explains why translating concepts into ontologies or a semantic web may entail a loss of meaning. I will therefore quote The Stanford Encyclopedia of Philosophy to elucidate this understanding of meaning:
There is no reason to look, as we have done traditionally—and dogmatically—for one, essential core in which the meaning of a word is located and which is, therefore, common to all uses of that word. We should, instead, travel with the word’s uses through ‘a complicated network of similarities overlapping and criss-crossing’ (PI 66).1 Family resemblance also serves to exhibit the lack of boundaries and the distance from exactness that characterize different uses of the same concept. Such boundaries and exactness are the definitive traits of form—be it Platonic form, Aristotelian form, or the general form of a proposition adumbrated in the Tractatus.2 It is from such forms that applications of concepts can be deduced, but this is precisely what Wittgenstein now eschews in favor of appeal to similarity of a kind with family resemblance.
To emphasize the elusive nature of privacy, we briefly follow Solove’s discussion of the categories enumerated above.
This understanding of privacy is related to intimacy, to the idea of drawing boundaries around a small circle of people with whom one dares to expose oneself, sharing information that might otherwise be used to shame a person, or to diminish or ridicule their agency. Intimacy relates to trust, not in the sense of confidence and security, but in the sense of trusting others enough to take the risk of being betrayed. One could ask what information is intimate, but this assumes that ‘intimacy’ is a property of information, whereas all depends on the situation, the context, and the roles played by intimate others. In some situations, financial information, or information shared with a health insurance company, may be intimate information, because it reveals to others what makes a person vulnerable to shame, ridicule, or even to life-threatening manipulation.
If we then take together privacy as limited access, and secrecy, anonymity and solitude, we can address the legal notion of third-party disclosure.
In the United States, the Supreme Court decided, in 1967,3 that once a person exposes their personal data to a third party such as banks or other service providers, they have no reasonable expectation of privacy regarding access by the government. This so-called ‘third-party doctrine’ reflects an approach to privacy that is radically different from the European approach, which does not presume that disclosing private information to one entity necessarily implies that other entities are now free to obtain and use such information.
Note that the United States has since enacted legislation requiring a warrant for access to specific data, thus providing specified protection for, for example, financial data and telephone data. We have already encountered the case of US v. Jones (followed by Riley and Carpenter, see section 2.1.2, n. 2), where the Supreme Court decided that police warrants were necessary in the case of GPS trackers, information on a cell phone, and cell-site records of a wireless carrier. These judgments may lead to the end of the third-party doctrine, depending on subsequent case law.
Defining privacy in terms of control comes close to thinking of personally identifiable information (PII) as if it were the property of the person it concerns. PII is, just like informational privacy, a term used in the United States, whereas in the EU we generally speak of data protection and personal data. Thinking of PII in terms of property creates a number of problems, as neither data nor information is rivalrous or exclusionary. One person ‘having’ certain information does not necessarily imply that others do not ‘have’ that same information, whereas one person possessing a book implies that others do not possess it. It is therefore important to distinguish between control over ‘access to’ and ‘usage of’ information on the one hand, and property rights in information on the other. The latter applies in the case of intellectual property rights (e.g. copyright or patent), but not in the case of personal data. Below, we will discuss to what extent EU data protection law provides control to data subjects (those to whom personal data refers), but we can already point out here that full control over one’s personal data ignores the relational nature of personal data. To illustrate the latter point, we can think of Robinson Crusoe and ask the question whether he had a name before Friday came to his island. We have a name to be singled out by others, to be addressed by others, and to appear as a singular individual person before others. This implies that, though we need some control over the sharing of our name, such control cannot be unlimited. Without fellows to address us, we effectively ‘have’ no name.
Indeed, this is how Agre and Rotenberg defined privacy, highlighting the interrelationship between negative and positive freedom. This also suggests that liberty and autonomy overlap and support each other. For instance, what has been called ‘decisional privacy’ (e.g. the right of a woman to decide about an abortion) clearly marks the nexus of positive freedom (to decide an abortion) with negative freedom (to be free from unreasonable constraints on such a decision). The crux of Agre and Rotenberg’s definition resides in the requirement that people are free from unreasonable constraints, not just any constraints. In case law, legislation, and doctrine the concept of ‘reasonable’ or ‘unreasonable’ is of prime importance. Instead of framing this as a source of uncertainty, because of its prima facie vagueness, this concept can be seen as an aid in aligning different conceptions of legal goods that warrant protection. Demanding that a duty of care is exercised in a reasonable way acknowledges that ‘a duty of care’ cannot be defined in the abstract, but is better understood in terms of family resemblances. The duty of care of a mother, an employer, a manufacturer, and a social network provider may not share any common element; they nevertheless align along the lines of reasonable expectations and proper checks and balances, considering the relevant context and the roles of the parties involved. Similarly, reasonable expectations of privacy depend on context, on roles played, on checks and balances, and meaningful choice. This is not because privacy is a vague concept but because the practice of privacy is complex, requiring acuity to what is at stake for whom.
In the end, defining privacy is a decision to be taken when confronted with its violation. As Solove saliently writes in reference to a famous American philosopher:
‘[K]nowledge is an affair of making sure,’ Dewey observed, ‘not of grasping antecedently given sureties.’
This is what the courts must achieve every time a case is brought before them: making the difference that makes a difference.
5.2.2 Privacy and technology
After tracing the conceptual challenges of delineating privacy, I will briefly trace the relationship between privacy and technology. Some of us may think that privacy is a property of people in general, just like animals often display what ethologists call ‘critical distance’ from each other.
Rather than being a matter of seclusion, Altman frames privacy as a continuous process of sharing and excluding, based on societal practices that are in turn dependent on technological affordances of the environment. In that sense, privacy can be detected in most human societies, though under different names and with very different constraints.
In a famous article in the Harvard Law Review, US legal scholars Samuel Warren and Louis Brandeis discussed the need to protect oneself against publication of photographs without permission, to enable social withdrawal. In that article, they formulated the right to privacy as the right to be left alone, basically arguing for the existence of a privacy tort whenever this right was infringed upon without justification. Interestingly, privacy was thus introduced as a private law issue rather than a constitutional right. When Brandeis later served as justice in the Supreme Court, however, he argued that such a right to be left alone must be ‘read into’ the US Constitution, notably into the Bill of Rights, thus vouching for a right to privacy against the state. The rise of mass media and photography afforded massive dissemination of pictures taken, thus infringing the privacy of those concerned in a previously unprecedented manner. This, in turn, gave rise to defence mechanisms to safeguard one’s capability to withdraw from such exposure.
After the Second World War, a new technological infrastructure surfaced to enable and improve public administration, in the form of computerized databases. This resulted in the collection and storage of myriad data relating to identifiable citizens, enabling government agencies to better target their constituency and to engage in what would now be termed ‘evidence-based policy’. This, in turn, raised the question to whom this data belongs. In 1967, Alan Westin wrote a seminal work on Privacy and Freedom, taking a clear stand on the question of who should—by default—be capable of controlling access to data concerning individual persons. Privacy, he wrote, is:
the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.
This concept of informational privacy, as control over information, informs much of the debate about privacy and data protection in our current age. It is interesting to note that it emerged in counterpoint to the rise of databases in public administration, as well as private enterprise. The fact that data was collected, sorted, and recorded, enabling retrieval as well as aggregation, gave rise to new types of transparency, and new types of threats to personal identity. This was related to the fact that in this era the data collected and stored was mostly stable data, allowing the mapping of both individuals and populations in a consistent and foreseeable way, without the kind of dynamic and unstructured big data capture that characterizes the current era.
After the rise of the internet and the world wide web, combined with the capture of big data and data-driven techniques to infer new information, the need for a more complex and contextual right to privacy seems obvious. Negative freedom will not do, as data abounds and is captured beyond one’s control on a permanent basis. For the same reason, positive freedom seems unattainable, as consent loses its meaning amidst the volume, variety, and velocity of data capture, storage, and use. A more practical and effective way of understanding privacy should therefore combine negative and positive freedom, while highlighting the relationship with identity-construction, not merely identification.
The definition of Agre and Rotenberg, referred to above, may be the most apt for the era of proactive and pre-emptive computing infrastructures, depicting the right to privacy as:
the right to be free of unreasonable constraints on the building of one’s identity.
For some readers, this may sound overly vague or complicated. To confront a complex, volatile, invasive, and pre-emptive environment we will, however, need an understanding of privacy that goes beyond the hiding of personal data.
5.3 The Right to Privacy
Privacy is a value, an interest, a right, or a good. It can be analysed from an ethical perspective (as a value, a virtue, or duty), from an economic perspective (as a utility, a preference, or an interest), and from the perspective of political theory (as a public and a private good). In this work, we will focus on the legal perspective, tracing positive law’s applicability to issues of privacy. Below, we will discuss the right to privacy from the perspectives of constitutional, international, and supranational law, ending with a discussion of Article 8 ECHR.
5.3.1 The right to privacy: constitutional law
The right to privacy is a subjective right, attributed by objective law. The most obvious branch of objective law that attributes the subjective right of privacy is constitutional law, which often contains a section that aims to protect citizens against overly invasive powers of the state. Historically, human rights initially played out in the vertical relationship between state and citizens, not in the horizontal relations between private parties. The industrial revolution of the nineteenth century gave rise to powerful economic actors whose ability to infringe privacy, freedom of information, and non-discrimination increasingly matched the powers of the state.
In many states outside the Council of Europe, the Constitution provides the main protection against infringements of the right to privacy. For instance, in the United States, even though neither the 1787 US Constitution nor the 1791 Amendments to the US Constitution (known as the Bill of Rights) explicitly refer to a right to privacy, the Supreme Court of the United States has nevertheless interpreted various articles of the Bill of Rights as safeguarding an individual right to privacy,4 notably based on the Fourth Amendment:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
We can read these protections in terms of legal conditions and legal effect, by stating that ‘searches and seizures’ by government officials are only lawful if:
• there is probable cause,
• a warrant has been issued,
• which contains limitations as to what is allowed.
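Purely as a didactic illustration of this conditions-and-effect reading (a sketch, not a claim that legal assessment can be automated; the class and function names are invented for the example), the structure of the rule can be expressed as a boolean function over the three conditions:

```python
from dataclasses import dataclass

@dataclass
class Search:
    probable_cause: bool        # condition 1: probable cause exists
    warrant_issued: bool        # condition 2: a warrant has been issued
    warrant_is_specific: bool   # condition 3: the warrant particularly
                                # describes the place, persons, or things

def is_lawful(search: Search) -> bool:
    """The legal effect (lawfulness) follows only if all conditions hold."""
    return (search.probable_cause
            and search.warrant_issued
            and search.warrant_is_specific)

# A general warrant lacking a particular description fails the test.
print(is_lawful(Search(True, True, False)))  # prints False
```

The legal effect obtains only if all three conditions are satisfied; real adjudication, of course, turns on the contested interpretation of each condition rather than on mechanical evaluation.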
As we have already seen in sections 2.1.2 and 5.1.2, the question here is (1) whether this right protects against violation of property rights (trespass) or also against violation of reasonable expectations of privacy that do not depend on property and (2) whether search and seizure of, for example, a mobile phone falls within the scope of the Fourth Amendment, as a phone is arguably neither a person, a house, papers, nor effects.
5.3.2 The right to privacy: international law
Protection of human rights requires a resilient system of checks and balances, that is, a series of institutional safeguards to ensure that the state does not claim unreasonable exceptions and faces a stringently independent judiciary to keep the powers of the state ‘in check’. As noted above, the need to protect subjects of the state against the state gave rise to international human rights law, which provides an extra layer of checks and balances. Privacy is explicitly protected by Article 17 of the United Nations (UN) International Covenant on Civil and Political Rights (ICCPR) of 1966, and by Article 8 ECHR of 1950, two examples of international law. Both articles are similar; we quote Article 8 ECHR to give the reader a first taste:
1. Everyone has the right to respect for his private and family life, his home and his correspondence.
2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
The UN ICCPR has global application, with currently 178 signatories and 172 ratifications, but its enforcement mechanisms are relatively weak compared to the ECHR. In Article 34, the ECHR provides citizens of the forty-seven contracting parties with an individual right to complain to the European Court of Human Rights (ECtHR):
The Court may receive applications from any person, non-governmental organisation or group of individuals claiming to be the victim of a violation by one of the High Contracting Parties of the rights set forth in the Convention or the protocols thereto. The High Contracting Parties undertake not to hinder in any way the effective exercise of this right.
The ECHR, however, does not have global application, as it only applies within the jurisdiction of the Council of Europe.
5.3.3 The right to privacy: supranational law
Since 2009, when the CFREU came into force, the protection of human rights has gained even more traction, adding a second European Court with competence to test legislation, decisions, and actions against a catalogue of human rights. This protection, offered at the level of supranational law, is applicable whenever member states (MSs) ‘are implementing Union law’ (Article 51 CFREU). As human rights developed with the rise of the modern state, they further developed with the rise of supranational jurisdiction. The prevailing powers of the institutions of the EU demand countervailing powers in the form of supranational fundamental rights.
5.3.4 Article 8 ECHR
In this section, we will discuss one of the most crucial legal rights of this book. The right to privacy that is articulated in Article 8 ECHR is not only relevant for bodily integrity, decisional privacy, and the other aspects of privacy, but also directly affects issues of cybercrime and copyright. This is due to the fact that cybercrimes may violate privacy (hacking, data breaches), or that copyright holders may violate privacy when disseminating their works (photographs, texts), but also because the investigative measures that aim to detect cybercrime and violations of copyright often infringe upon the right to privacy as protected in Article 8.
Here, we develop a first analysis of the legal conditions stipulated by Article 8 ECHR, how they are explained by the ECtHR, and the legal effects they generate.
Article 8 consists of two paragraphs. The first paragraph concerns the question of whether privacy is infringed, the second paragraph clarifies under what conditions an infringement is justified.
1. Everyone has the right to respect for his private and family life, his home and his correspondence.
The ECtHR takes the view that these concepts require a broad rather than a narrow interpretation, bringing a wide variety of situations, events, relationships, and contexts under the protection of Article 8.
Private life can be at stake in the context of work, meaning that a search of an office space may be an infringement of privacy.5 Family life is at stake when a state prohibits members of a family from living together, for instance in the case of a refusal to provide a residence permit for a partner from another state, or of a parent wishing to further develop a relationship with their child despite not being married to the other parent. Protection of the home may become relevant when a person has taken residence in a house they neither own nor rent, meaning that the need to respect one’s home is not dependent upon ownership or contract. The confidentiality of communication has been interpreted to include letters, telephone calls, and more recently all types of internet-enabled communication that is not public. Privacy, as protected by Article 8, clearly concerns physical, spatial, contextual, decisional, communicative, and informational privacy, and although Article 8 addresses the contracting states, its indirect horizontal effect has been recognized by the ECtHR, requiring states to ensure proper protection against violations by others than the state. Note that the individual complaint right of the ECHR can only be invoked against a state, not against a company. To invoke direct horizontal effect, a person needs to sue the tortfeasor in a national court.
2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
The articulation of legitimate aims in Article 8.2 is rather inclusive, which means that the ECtHR seldom finds reason to endorse the claim that the state lacked a legitimate aim.
Many of the cases where the ECtHR (the Court) finds that Article 8 has been violated concern the legal condition that the infringement must be ‘in accordance with the law’ to be justified. This basically refers to the legality principle of constitutional law (see section 3.3).
Note that the Court will not merely check legislative or regulatory provisions, but test practical arrangements and actual safeguards to establish whether the infringing measures were taken ‘in accordance with the law’. Throughout its case law, the ECtHR demands that the rights attributed in the ECHR are both ‘practical and effective’, stating that:6
[t]he Convention is intended to guarantee not rights that are theoretical or illusory but rights that are practical and effective ( … ).
If privacy is infringed with a legitimate aim, based on a legal competence that is accessible, foreseeable, while having sufficient safeguards, the final test is a proportionality test.
• Proportionality in the strict sense: under this criterion the Court will examine the gravity, invasiveness, and seriousness of the infringement in relation to the importance and seriousness of the aim served.
• Effectiveness: this criterion basically requires that the measures taken can reasonably be expected to be effective, because a measure that is not effective cannot be necessary.
• Subsidiarity: the proportionality test includes a subsidiarity test; if another, less infringing measure is feasible and sufficiently effective, the measure is not proportional.
5.3.5 Case law Article 8 ECHR regarding surveillance
When developing computing architectures, whether in the context of databases, streaming data, machine-to-machine communication, knowledge discovery in databases, machine learning, or cryptographic infrastructures, computer scientists lay the foundations for the ICIs that enable the processing, storage, interlinking, and inferencing of behavioural and other personal data. This may regard online clickstream behaviour, location, and mobility data, energy usage behaviours, biometric gait behaviour, and a plethora of communication data, including both content and metadata. Governments, tasked with the investigation and prosecution of criminal offences and the protection of national and public security, have many incentives to gain access to such data. Apart from the struggle against serious crime and threats to national security, governments need to collect taxes, attribute social benefits, take precautionary measures regarding public health, and safeguard the economic welfare of the country. All these tasks fall within the scope of the legitimate aims enumerated in Article 8.2 ECHR. This raises the question under what (p.118) conditions surveillance measures can be qualified as ‘in accordance with the law’ and if so, when they are considered ‘proportional’ to the targeted aim.
Surveillance measures by the police may regard post-crime investigatory measures (to identify an offender after a crime has been committed) or pre-crime investigations (to prevent potential offending, or to foresee likely offences). To understand how the Court deals with various types of electronic surveillance, we will discuss two cases of post-crime surveillance and two cases of pre-crime surveillance (including surveillance by the intelligence services, which falls outside the domain of criminal law).
5.3.5.1 Post-crime surveillance
In 1984, in Malone v. UK,7 the ECtHR determined that the United Kingdom was in breach of Article 8 ECHR, where it allowed the interception of telephone conversations by the police upon a warrant issued by the Secretary of State. The Court determined that for such a measure to be ‘in accordance with the law’, it must not merely have a basis in domestic law (meaning a legal power), but must also be foreseeable and sufficiently limited as required by the rule of law:
68. Since the implementation in practice of measures of secret surveillance of communications is not open to scrutiny by the individuals concerned or the public at large, it would be contrary to the rule of law for the legal discretion granted to the executive to be expressed in terms of an unfettered power. Consequently, the law must indicate the scope of any such discretion conferred on the competent authorities and the manner of its exercise with sufficient clarity ( … ).
When applying this interpretation, the Court finds that:
79. The foregoing considerations disclose that, at the very least, in its present state the law in England and Wales governing interception of communications for police purposes is somewhat obscure and open to differing interpretations. The Court would be usurping the function of the national courts were it to attempt to make (p.119) an authoritative statement on such issues of domestic law (see, mutatis mutandis, the Deweer judgment of 27 February 1980, Series A no. 35, p. 28, in fine, and the Van Droogenbroeck judgment of 24 June 1982, Series A no. 50, p. 30, fourth sub-paragraph). The Court is, however, required under the Convention to determine whether, for the purposes of paragraph 2 of Article 8 (art. 8-2), the relevant law lays down with reasonable clarity the essential elements of the authorities’ powers in this domain.
Detailed procedures concerning interception of communications on behalf of the police in England and Wales do exist (see paragraphs 42–49, 51–52 and 54–55 above). What is more, published statistics show the efficacy of those procedures in keeping the number of warrants granted relatively low, especially when compared with the rising number of indictable crimes committed and telephones installed (see paragraph 53 above). The public have been made aware of the applicable arrangements and principles through publication of the Birkett report and the White Paper and through statements by responsible Ministers in Parliament (see paragraphs 21, 37–38, 41, 43 and 54 above).
Nonetheless, on the evidence before the Court, it cannot be said with any reasonable certainty what elements of the powers to intercept are incorporated in legal rules and what elements remain within the discretion of the executive. In view of the attendant obscurity and uncertainty as to the state of the law in this essential respect, the Court cannot but reach a similar conclusion to that of the Commission. In the opinion of the Court, the law of England and Wales does not indicate with reasonable clarity the scope and manner of exercise of the relevant discretion conferred on the public authorities. To that extent, the minimum degree of legal protection to which citizens are entitled under the rule of law in a democratic society is lacking.
80. In sum, as far as interception of communications is concerned, the interferences with the applicant’s right under Article 8 (art. 8) to respect for his private life and correspondence (see paragraph 64 above) were not ‘in accordance with the law’.
In this case, Malone not only claimed that the interception of the content of his telephone conversations violated his right to privacy under the Convention, but also that the capture of what we would now call metadata violated said right. The Court states, with regard to this capture, known as ‘metering’:
83. The process known as ‘metering’ involves the use of a device (a meter check printer) which registers the numbers dialled on a particular telephone and the time (p.120) and duration of each call (see paragraph 56 above). In making such records, the Post Office—now British Telecommunications—makes use only of signals sent to itself as the provider of the telephone service and does not monitor or intercept telephone conversations at all. From this, the Government drew the conclusion that metering, in contrast to interception of communications, does not entail interference with any right guaranteed by Article 8 (art. 8).
87. Section 80 of the Post Office Act 1969 has never been applied so as to ‘require’ the Post Office, pursuant to a warrant of the Secretary of State, to make available to the police in connection with the investigation of crime information obtained from metering. On the other hand, no rule of domestic law makes it unlawful for the Post Office voluntarily to comply with a request from the police to make and supply records of metering (see paragraph 56 above). The practice described above, including the limitative conditions as to when the information may be provided, has been made public in answer to parliamentary questions (ibid.). However, on the evidence adduced before the Court, apart from the simple absence of prohibition, there would appear to be no legal rules concerning the scope and manner of exercise of the discretion enjoyed by the public authorities. Consequently, although lawful in terms of domestic law, the interference resulting from the existence of the practice in question was not ‘in accordance with the law’, within the meaning of paragraph 2 of Article 8 (art. 8-2) (see paragraphs 66 to 68 above).
In 1990, in Huvig & Kruslin v. France,8 the ECtHR determined that Article 8 was breached. The case concerned the interception of telephone conversations, as in the Malone case. The Court extensively refers to its contentions in the Malone judgment as to the requirement of such interceptions being ‘in accordance with the law’. It then states:
35. Above all, the system does not for the time being afford adequate safeguards against various possible abuses. For example, the categories of people liable to (p.121) have their telephones tapped by judicial order and the nature of the offences which may give rise to such an order are nowhere defined. Nothing obliges a judge to set a limit on the duration of telephone tapping. Similarly unspecified are the procedure for drawing up the summary reports containing intercepted conversations; the precautions to be taken in order to communicate the recordings intact and in their entirety for possible inspection by the judge (who can hardly verify the number and length of the original tapes on the spot) and by the defence; and the circumstances in which recordings may or must be erased or the tapes be destroyed, in particular where an accused has been discharged by an investigating judge or acquitted by a court. The information provided by the Government on these various points shows at best the existence of a practice, but a practice lacking the necessary regulatory control in the absence of legislation or case-law.
36. In short, French law, written and unwritten, does not indicate with reasonable clarity the scope and manner of exercise of the relevant discretion conferred on the public authorities. This was truer still at the material time, so that Mr Kruslin did not enjoy the minimum degree of protection to which citizens are entitled under the rule of law in a democratic society (see the Malone judgment previously cited, Series A no. 82, p. 36, § 79). There has therefore been a breach of Article 8 (art. 8) of the Convention.
5.3.5.2 Pre-crime surveillance (including surveillance by the intelligence services)
In 1978, in Klass v. Germany,9 the ECtHR decided a case regarding surveillance measures taken by the secret services in Germany. I will quote the most relevant considerations from the judgment, which should clarify how the Court argues points of law and thus shapes the interpretation of legal conditions:
All five applicants claim that Article 10 para. 2 of the Basic Law (Grundgesetz) and a statute enacted in pursuance of that provision, namely the Act of 13 August 1968 on Restrictions on the Secrecy of the Mail, Post and Telecommunications (… hereinafter referred to as ‘the G 10’), are contrary to the Convention.
They do not dispute that the State has the right to have recourse to the surveillance measures contemplated by the legislation; they challenge this legislation in that it (p.122) permits those measures without obliging the authorities in every case to notify the persons concerned after the event, and in that it excludes any remedy before the courts against the ordering and execution of such measures.
Their application is directed against the legislation as modified and interpreted by the Federal Constitutional Court (Bundesverfassungsgericht).
The Court first discusses the admissibility of the complaint, raising the question whether the applicant is a victim of a violation by one of the Contracting States.
33. ( … ) Article 25 (art. 25) [now Article 34, mh] does not institute for individuals a kind of actio popularis for the interpretation of the Convention; it does not permit individuals to complain against a law in abstracto simply because they feel that it contravenes the Convention. In principle, it does not suffice for an individual applicant to claim that the mere existence of a law violates his rights under the Convention; it is necessary that the law should have been applied to his detriment.
34. ( … ) The question arises in the present proceedings whether an individual is to be deprived of the opportunity of lodging an application with the Commission because, owing to the secrecy of the measures objected to, he cannot point to any concrete measure specifically affecting him. ( … )
36. The Court points out that where a State institutes secret surveillance the existence of which remains unknown to the persons being controlled, with the effect that the surveillance remains unchallengeable, Article 8 (art. 8) could to a large extent be reduced to a nullity. It is possible in such a situation for an individual to be treated in a manner contrary to Article 8 (art. 8), or even to be deprived of the right granted by that Article (art. 8), without his being aware of it and therefore without being able to obtain a remedy either at the national level or before the Convention institutions. ( … ) The Court finds it unacceptable that the assurance of the enjoyment of a right guaranteed by the Convention could be thus removed by the simple fact that the person concerned is kept unaware of its violation. ( … )
38. Having regard to the specific circumstances of the present case, the Court concludes that each of the applicants is entitled to ‘(claim) to be the victim of a violation’ of the Convention, even though he is not able to allege in support of his application that he has been subject to a concrete measure of surveillance.
This entails that the Court makes an exception to the requirement that applicants must claim and demonstrate that they are victims of a violation in concrete terms. Depending on the specific circumstances of the case at hand, the Court may decide to conduct an abstract test of the relevant legislation, attributing the (p.123) status of ‘victim’ in the sense of what is now Article 34 ECHR to those who may have been subject to secret surveillance measures.
The Court then quotes relevant legislation, notably Article 10 of the Basic Law of Germany:
(1) Secrecy of the mail, post and telecommunications shall be inviolable.
(2) Restrictions may be ordered only pursuant to a statute. Where such restrictions are intended to protect the free democratic constitutional order or the existence or security of the Federation or of a Land, the statute may provide that the person concerned shall not be notified of the restriction and that legal remedy through the courts shall be replaced by a system of scrutiny by agencies and auxiliary agencies appointed by the people’s elected representatives.
The Court begins by investigating whether the legislation contested by the applicants constitutes an interference with Article 8.1 ECHR:
41. The first matter to be decided is whether and, if so, in what respect the contested legislation, in permitting the above-mentioned measures of surveillance, constitutes an interference with the exercise of the right guaranteed to the applicants under Article 8 para. 1 (art. 8-1). ( … )
Furthermore, in the mere existence of the legislation itself there is involved, for all those to whom the legislation could be applied, a menace of surveillance; this menace necessarily strikes at freedom of communication between users of the postal and telecommunication services and thereby constitutes an ‘interference by a public authority’ with the exercise of the applicants’ right to respect for private and family life and for correspondence.
As is often the case, the Court takes a broad view of the scope of the first paragraph and decides that the legislation constitutes an infringement. The next question is whether the infringement is justified:
42. The cardinal issue arising under Article 8 (art. 8) in the present case is whether the interference so found is justified by the terms of paragraph 2 of the Article (art. 8-2).
The Court first tests whether the infringement is ‘in accordance with the law’:
This requirement is fulfilled in the present case since the ‘interference’ results from Acts passed by Parliament, including one Act which was modified by the Federal Constitutional Court, in the exercise of its jurisdiction, by its judgment of 15 December 1970 (see paragraph 11 above).
In addition, the Court observes that, as both the Government and the Commission pointed out, any individual measure of surveillance has to comply with the strict conditions and procedures laid down in the legislation itself.
This leads the Court to test whether the interference has a legitimate aim:
45. The G 10 defines precisely, and thereby limits, the purposes for which the restrictive measures may be imposed. It provides that, in order to protect against ‘imminent dangers’ threatening ‘the free democratic constitutional order’, ‘the existence or security of the Federation or of a Land’, ‘the security of the (allied) armed forces’ stationed on the territory of the Republic or the security of ‘the troops of one of the Three Powers stationed in the Land of Berlin’, the responsible authorities may authorise the restrictions referred to above (see paragraph 17).
46. The Court, sharing the view of the Government and the Commission, finds that the aim of the G 10 is indeed to safeguard national security and/or to prevent disorder or crime in pursuance of Article 8 para. 2 (art. 8-2). In these circumstances, the Court does not deem it necessary to decide whether the further purposes cited by the Government are also relevant.
This brings the Court to the final criterion of the triple test, investigating whether the interference is necessary in a democratic society. Below you will find an extensive quotation of (part of) the reasoning of the Court regarding the question whether the interference enabled by the legislation is proportional, considering what is at stake.
47. The applicants do not object to the German legislation in that it provides for wide-ranging powers of surveillance; they accept such powers, and the resultant encroachment upon the right guaranteed by Article 8 para. 1 (art. 8-1), as being a necessary means of defence for the protection of the democratic State.
The applicants consider, however, that paragraph 2 of Article 8 (art. 8-2) lays down for such powers certain limits which have to be respected in a democratic society in order to ensure that the society does not slide imperceptibly towards totalitarianism. In their view, the contested legislation lacks adequate safeguards against possible abuse. (p.125)
49. As concerns the fixing of the conditions under which the system of surveillance is to be operated, the Court points out that the domestic legislature enjoys a certain discretion. It is certainly not for the Court to substitute for the assessment of the national authorities any other assessment of what might be the best policy in this field ( … )
Nevertheless, the Court stresses that this does not mean that the Contracting States enjoy an unlimited discretion to subject persons within their jurisdiction to secret surveillance. The Court, being aware of the danger such a law poses of undermining or even destroying democracy on the ground of defending it, affirms that the Contracting States may not, in the name of the struggle against espionage and terrorism, adopt whatever measures they deem appropriate.
51. According to the G 10, a series of limitative conditions have to be satisfied before a surveillance measure can be imposed. ( … )
52. The G 10 also lays down strict conditions with regard to the implementation of the surveillance measures and to the processing of the information thereby obtained. ( … )
53. Under the G 10, while recourse to the courts in respect of the ordering and implementation of measures of surveillance is excluded, subsequent control or review is provided instead, in accordance with Article 10 para. 2 of the Basic Law, by two bodies appointed by the people’s elected representatives, namely, the Parliamentary Board and the G 10 Commission. ( … )
54. The Government maintain that Article 8 para. 2 (art. 8-2) does not require judicial control of secret surveillance and that the system of review established under the G 10 does effectively protect the rights of the individual. The applicants, on the other hand, qualify this system as a ‘form of political control’, inadequate in comparison with the principle of judicial control which ought to prevail.
It therefore has to be determined whether the procedures for supervising the ordering and implementation of the restrictive measures are such as to keep the ‘interference’ resulting from the contested legislation to what is ‘necessary in a democratic society’.
55. Review of surveillance may intervene at three stages: when the surveillance is first ordered, while it is being carried out, or after it has been terminated. As regards the first two stages, the very nature and logic of secret surveillance dictate that not only the surveillance itself but also the accompanying review should be effected without the individual’s knowledge. (p.126)
Consequently, since the individual will necessarily be prevented from seeking an effective remedy of his own accord or from taking a direct part in any review proceedings, it is essential that the procedures established should themselves provide adequate and equivalent guarantees safeguarding the individual’s rights.
In addition, the values of a democratic society must be followed as faithfully as possible in the supervisory procedures if the bounds of necessity, within the meaning of Article 8 para. 2 (art. 8-2), are not to be exceeded.
One of the fundamental principles of a democratic society is the rule of law, which is expressly referred to in the Preamble to the Convention (see the Golder judgment of 21 February 1975, Series A no. 18, pp. 16–17, para. 34). The rule of law implies, inter alia, that an interference by the executive authorities with an individual’s rights should be subject to an effective control which should normally be assured by the judiciary, at least in the last resort, judicial control offering the best guarantees of independence, impartiality and a proper procedure.
56. The Court considers that, in a field where abuse is potentially so easy in individual cases and could have such harmful consequences for democratic society as a whole, it is in principle desirable to entrust supervisory control to a judge.
Nevertheless, having regard to the nature of the supervisory and other safeguards provided for by the G 10, the Court concludes that the exclusion of judicial control does not exceed the limits of what may be deemed necessary in a democratic society.
58. In the opinion of the Court, it has to be ascertained whether it is even feasible in practice to require subsequent notification in all cases.
The activity or danger against which a particular series of surveillance measures is directed may continue for years, even decades, after the suspension of those measures.
Subsequent notification to each individual affected by a suspended measure might well jeopardise the long-term purpose that originally prompted the surveillance. Furthermore, as the Federal Constitutional Court rightly observed, such notification might serve to reveal the working methods and fields of operation of the intelligence services and even possibly to identify their agents.
In the Court’s view, in so far as the ‘interference’ resulting from the contested legislation is in principle justified under Article 8 para. 2 (art. 8-2) (see paragraph 48 above), the fact of not informing the individual once surveillance has ceased cannot itself be incompatible with this provision since it is this very fact which ensures the efficacy of the ‘interference’. (p.127)
For these reasons the Court
1. holds unanimously that it has jurisdiction to rule on the question whether the applicants can claim to be victims within the meaning of Article 25 (art. 25) of the Convention;
2. holds unanimously that the applicants can claim to be victims within the meaning of the aforesaid Article (art. 25);
3. holds unanimously that there has been no breach of Article 8, Article 13 or Article 6 (art. 8, art. 13, art. 6) of the Convention.
In 2006, the ECtHR decided the case of Weber & Saravia v. Germany,10 once again testing legislation regarding so-called ‘strategic monitoring’ by intelligence services. In this case, the Court specifies in more detail what qualifies as ‘interferences’ that are ‘in accordance with the law’. Although, after having conducted the triple test, the Court decided that the contested legislation did not violate Article 8 ECHR, I will quote the legal conditions summed up by the Court to attain the legal effect of such interferences qualifying as being ‘in accordance with the law’.
95. In its case-law on secret measures of surveillance, the Court has developed the following minimum safeguards that should be set out in statute law in order to avoid abuses of power:
• the nature of the offences which may give rise to an interception order;
• a definition of the categories of people liable to have their telephones tapped;
• a limit on the duration of telephone tapping;
• the procedure to be followed for examining, using and storing the data obtained;
• the precautions to be taken when communicating the data to other parties; and
• the circumstances in which recordings may or must be erased or the tapes destroyed.
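These six minimum safeguards amount to a checklist that statute law must set out. As a purely illustrative sketch (the data structure and the reduction of statutory interpretation to set membership are assumptions made for the example, not a method of legal analysis), one could encode the checklist as follows:

```python
# The six Weber & Saravia minimum safeguards (para. 95), as a simple checklist.
WEBER_SAFEGUARDS = [
    "nature of the offences that may give rise to an interception order",
    "definition of the categories of people liable to have their telephones tapped",
    "limit on the duration of telephone tapping",
    "procedure for examining, using and storing the data obtained",
    "precautions when communicating the data to other parties",
    "circumstances in which recordings may or must be erased or destroyed",
]

def missing_safeguards(statute_provisions: set[str]) -> list[str]:
    """Return the Weber safeguards that a (modelled) statute fails to set out."""
    return [s for s in WEBER_SAFEGUARDS if s not in statute_provisions]

# A hypothetical statute that only regulates duration and erasure:
statute = {WEBER_SAFEGUARDS[2], WEBER_SAFEGUARDS[5]}
print(len(missing_safeguards(statute)))  # 4
```

In reality, of course, whether a statute ‘sets out’ a safeguard with sufficient clarity is itself a matter of interpretation, as the Malone and Huvig & Kruslin judgments quoted above demonstrate.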
(p.128) Since 2006, a number of cases have been decided on the issue of surveillance, whether in the context of post-crime or pre-crime measures, including measures taken by the intelligence services.11 This includes both concrete interferences and legislation that would enable such interferences. As recounted above, the latter is not normally open to scrutiny by the Court, as it would concern an abstract test of the compatibility of domestic law with the Convention. The Court can, however, make an exception when applicants claim that the nature of the legislation or practice is such that they cannot know whether or not they have been a victim of state surveillance.
With the above analyses, which closely follow the reasoning of the Court, the reader should have sufficient analytical instruments to study, for instance, the case of Big Brother Watch and Others v. the United Kingdom of 2018.12 This case regards complaints about the compatibility with Article 8 ECHR of three discrete regimes of mass surveillance in the United Kingdom: first, the regime for the bulk interception of communications under section 8(4) of the Regulation of Investigatory Powers Act 2000 (RIPA); second, the UK–US intelligence-sharing regime applied by the security service (MI5), the secret intelligence service (MI6), and the Government Communications Headquarters (GCHQ, which covers information and signals intelligence or ‘sigint’); and third, the regime for the acquisition of communications data under Chapter II of RIPA. The purpose of this work is not to provide an exhaustive overview of positive law in the realm of the right to privacy, but to provide computer scientists and students of computer science with a proper understanding of law as a scholarly discipline and a professional practice. In the end, the proof of the pudding will be in the eating. The reader is invited and encouraged to have their own tastings of legal texts, discovering the major impact of legal decision-making on potential violations of, for example, the right to privacy.
5.4 Privacy and Data Protection
Since the CFREU (or ‘the Charter’) entered into force in 2009, the EU ‘has’ two fundamental rights regarding the processing of personal data:
Article 7 Respect for private and family life
Everyone has the right to respect for his or her private and family life, home and communications.
Article 8 Protection of personal data
1. Everyone has the right to the protection of personal data concerning him or her.
2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.
3. Compliance with these rules shall be subject to control by an independent authority.
Article 52 of the Charter clarifies the relationship between Article 7 of the Charter and Article 8 ECHR, which both refer to the right to privacy.
3. In so far as this Charter contains rights which correspond to rights guaranteed by the Convention for the Protection of Human Rights and Fundamental Freedoms, the meaning and scope of those rights shall be the same as those laid down by the said Convention. This provision shall not prevent Union law providing more extensive protection.
This stipulates that Article 7 CFREU cannot be interpreted as providing less protection than Article 8 ECHR, but may be interpreted as attributing additional protection. To the extent that Article 8 CFREU corresponds to Article 8 ECHR, it similarly cannot be interpreted as providing less protection than Article 8 ECHR, though it may provide additional protection.
5.4.1 Defaults: an opacity right and a transparency right
Some authors have argued that whereas, by default, the right to privacy is foremost an opacity right, the right to data protection is foremost a transparency right. As an (p.130) opacity right, the right to privacy aims to safeguard a private sphere for individual citizens, where they can basically ward off interference by others, most notably the state. This highlights the idea that privacy is a liberty right, a negative right that obligates others to refrain from interference with the good that is protected. As a transparency right, the right to data protection aims to ensure that whenever personal data is processed (which includes collection, access, manipulation, and any other usage), such processing is done in a transparent manner, in compliance with a set of conditions that should ensure fair and lawful processing.
Note that the opacity concerns the private sphere of an individual person, whereas the transparency concerns the state and other powerful actors when processing personal data. This accords with the core tenets of the Rule of Law, which hold that whereas government should be as transparent as possible, citizens should be shielded from intrusive transparency by the government.
5.4.2 Distinctive but overlapping rights: a Venn diagram
Though one may be tempted to see the right to data protection as a subset of the right to privacy, this is not correct. Within the context of the EU, the right to privacy entails both more and less than the right to data protection. We portray this in Figure 5.1 below. (p.131)
Note that if such data are subsequently used for other purposes, for example, to support the business model of a webshop by way of targeted advertising, privacy may be at stake. Whether or not this is the case also relates to the fact that the right to privacy, as discussed above, is primarily at stake in the vertical relationship between a government and its citizens, whereas the right to data protection seems to be applicable to all those who process personal data. This is certainly the case for data processing that falls under the scope of the GDPR.
5.4.3 Legal remedies in case of violation
The right to privacy can be invoked in a national court of law, for instance in the course of criminal or administrative proceedings. As discussed above, individual citizens have a right to present their claim to the ECtHR, which resides in Strasbourg, but this can only be done after exhausting national remedies. That means that if one fails to claim violation of Article 8 ECHR at the national level, or if one fails to appeal against a judgment that denies such a violation, the application to the ECtHR will be inadmissible. See Articles 34 and 35 ECHR:
Article 34 Individual applications
The Court may receive applications from any person, non-governmental organisation or group of individuals claiming to be the victim of a violation by one of the High Contracting Parties of the rights set forth in the Convention or the Protocols thereto. The High Contracting Parties undertake not to hinder in any way the effective exercise of this right. (p.132)
Article 35 Admissibility criteria
The Court may only deal with the matter after all domestic remedies have been exhausted, according to the generally recognised rules of international law, and within a period of six months from the date on which the final decision was taken.
Both the right to privacy and the right to data protection of the CFREU have direct application in the MSs of the EU. This means one can invoke them in a national court of law. If, however, a question is raised about the interpretation of the Charter, Article 267 of the Treaty on the Functioning of the EU (TFEU) stipulates that so-called ‘preliminary questions’ can, or must, be referred to the CJEU (which resides in Luxembourg):
The Court of Justice of the European Union shall have jurisdiction to give preliminary rulings concerning:
(a) the interpretation of the Treaties;
(b) the validity and interpretation of acts of the institutions, bodies, offices or agencies of the Union;
Where such a question is raised before any court or tribunal of a Member State, that court or tribunal may, if it considers that a decision on the question is necessary to enable it to give judgment, request the Court to give a ruling thereon.
Where any such question is raised in a case pending before a court or tribunal of a Member State against whose decisions there is no judicial remedy under national law, that court or tribunal shall bring the matter before the Court.
( … )
5.5 Data Protection Law
The history of data protection law goes back to the 1970s, when various countries enacted legislation to ensure fair processing of personal information by the government. An early example was the US Privacy Act of 1974,13 which introduced a set of fair practices for dealing with personal information.
In 1980, the global Organisation for Economic Co-operation and Development (OECD) issued the so-called ‘Fair Information Principles’ (FIPs), as part of the (non-binding) Guidelines governing the protection of privacy and transborder flows of personal data:
Collection Limitation Principle
7. There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
Data Quality Principle
8. Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.
Purpose Specification Principle
9. The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
Use Limitation Principle
10. Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with Paragraph 9 except:
a) with the consent of the data subject; or
b) by the authority of law.
Security Safeguards Principle
11. Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorised access, destruction, use, modification or disclosure of data.
Openness Principle
12. There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.
Individual Participation Principle
13. Individuals should have the right:
a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to them;
b) to have communicated to them, data relating to them
i. within a reasonable time;
ii. at a charge, if any, that is not excessive;
iii. in a reasonable manner; and
iv. in a form that is readily intelligible to them;
c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and
d) to challenge data relating to them and, if the challenge is successful to have the data erased, rectified, completed or amended.
Accountability Principle
14. A data controller should be accountable for complying with measures which give effect to the principles stated above.
The version quoted has been taken from the updated Guidelines of 2013. The update does not concern the FIPs themselves, but aims to strengthen worldwide enforcement and accountability. With an eye to the increased scale of data processing and the new techniques for data analytics, the OECD recommends a risk-based approach that is proactive rather than reactive when it comes to the rights and freedoms of those affected by the processing of personal data.
5.5.1 EU and US data protection law
In the United States, data protection is part of the right to privacy (in Constitutional and tort law) and subject to sectoral legislation, notably with regard to finance, healthcare, special protection of children, and consumer protection. There is no general law on data protection, apart from the 1974 Privacy Act (which only applies to Federal Agencies). This means that the protection of personal data varies with the context of processing. In commercial contexts, much of the actual protection depends on the competences of the Federal Trade Commission (FTC), based on section 5 of the FTC Act:
(1) Unfair methods of competition in or affecting commerce, and unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful.
(2) The Commission is hereby empowered and directed to prevent persons, partnerships, or corporations, [except certain specified financial and industrial sectors] from using unfair methods of competition in or affecting commerce and unfair or deceptive acts or practices in or affecting commerce.
In the EU, the situation is altogether different, due to the general applicability of EU data protection law, which does not depend on whether a violation can be framed as ‘an unfair or deceptive act in or affecting commerce’. In the next subsection, we will provide an extensive discussion of the core content of EU data protection law.
5.5.2 EU data protection law
The GDPR is based on Article 16 TFEU, which reads:
1. Everyone has the right to the protection of personal data concerning them.
2. The European Parliament and the Council, acting in accordance with the ordinary legislative procedure, shall lay down the rules relating to the protection of individuals with regard to the processing of personal data by Union institutions, bodies, offices and agencies, and by the Member States when carrying out activities which fall within the scope of Union law, and the rules relating to the free movement of such data. Compliance with these rules shall be subject to the control of independent authorities.
( … )
The GDPR protects the fundamental right to data protection as stipulated in Article 8 CFREU. However, the GDPR goes beyond this, explicitly aiming to protect all the fundamental rights and freedoms that are implicated by the processing of personal data. But this is not the only goal of the Regulation. At the same time, the Regulation aims to prevent differing levels of data protection across the MSs from obstructing the internal market. So, harmonization of protection to ensure a free flow of personal data is the second, equally important, goal of the GDPR:
Article 1 Subject-matter and objectives
1. This Regulation lays down rules relating to the protection of natural persons with regard to the processing of personal data and rules relating to the free movement of personal data.
2. This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.
3. The free movement of personal data within the Union shall be neither restricted nor prohibited for reasons connected with the protection of natural persons with regard to the processing of personal data.
5.5.2.1 Sources of law regarding EU data protection law
So far, we have seen that the sources of law consist of legislation and treaties, case law, doctrine, customary law, and fundamental principles. In the case of EU data protection law, we have the founding Treaties,14 the Charter, the GDPR, the Police Data Protection Directive (PDPD),15 the ePrivacy Directive (ePD),16 and a whole series of other Regulations and Directives that may at some point be relevant (but will not be discussed here). Next to this we have the case law of the CJEU regarding data protection issues, decisions and policies of the supervisory authorities in the MSs and the European Data Protection Supervisor, and we have doctrinal treatises and journal articles which analyse and discuss the legislation, the case law, and the underlying principles and practices.
In the case of EU data protection law, we have one more source of law, which has played an important role in the interpretation of the former DPD: the Opinions and Guidelines of the independent Article 29 Working Party (Art. 29 WP). This was the advisory body (instituted by Article 29 DPD) that produced a great number of highly relevant interpretations of EU data protection law, which continue to function as an important source of law. Though its output was not binding, it has persuasive authority based on the experience and expertise of its members (the data protection supervisors of the MSs) and based on its official task, which was to advise on proper implementation of EU data protection law. Most of the Opinions, Guidelines and Recommendations of the Art. 29 WP are equally relevant under the GDPR, as the core principles and concepts have not changed.
The material scope of the GDPR is limited to ‘the processing of personal data’ (Article 2.1). The definition of ‘processing’, however, is very broad, as Article 4(2) reads:
‘processing’ means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction
The GDPR does not apply to the processing of personal data within the context of a household and it does not apply to processing of personal data in the context of the prevention and prosecution of crime and threats to public security.18 The household exception will usually exempt the users of social networks, but not the providers (see the section on data controllers and processors below). With regard to the prevention and prosecution of crime, the PDPD is in force, based on Article 39 of the Treaty of the European Union (TEU).19 Since the EU has no competence regarding public security (intelligence services), there is no EU legislation as to the processing of personal data in the context of threats to public security. Note that the ECHR does apply to issues of public security, so insofar as privacy is infringed, measures can be tested against Article 8 ECHR (see section 5.3.5, notably the cases of Klass and Weber & Saravia).
Next to the exemptions of Article 2, Article 23 states that MSs may enact legislation to restrict the applicability of specific GDPR provisions, provided such restrictions are necessary in a democratic society and target a limited set of goals, such as national security, defence, public security, the prevention, investigation, detection, and prosecution of criminal offences, or of breaches of ethics for regulated professions, an important objective of general public interest of a MS or of the EU, including monetary, budgetary, and taxation matters. Note that though restrictions based on these goals are allowed if they pass the proportionality test (‘necessary in a democratic society’ clearly refers to Article 8.2 ECHR), they also require a basis in law. Any such restrictions are only valid insofar as they respect the essence of the fundamental rights and freedoms.
The territorial scope of the GDPR is defined in Article 3:
1. This Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.
Bear in mind that if a tech company has an establishment in the EU, the GDPR applies to the processing of personal data, even if the processing takes place elsewhere. At some point a tech company relocated its headquarters from Ireland to the United States, because otherwise data subjects in countries outside the EU could appeal to the Irish data protection supervisor under the GDPR.
2. This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to:
(a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or
(b) the monitoring of their behaviour as far as their behaviour takes place within the Union.
5.5.2.2 Personal data and data subject
Article 4(1) GDPR clarifies that:
‘personal data’ means:
• any information
• relating to
• an identified or
• identifiable natural person (‘data subject’), that is,
• one who can be identified,
• directly or indirectly,
• in particular by reference to
• an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;
Many authors have pointed out that this entails a very broad view of ‘personal data’, potentially bringing nearly any data under the heading of personal data. This is especially the case as the combination of increased availability and increased searchability and linkability of massive amounts of data will enable identification and re-identification of data that would previously not have been considered personal data. At some point, data about the weather, about room temperature, about the arrival of a train may become personal data, when it can be related to a person that can be singled out. Recital 26 reads:
To determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly.
The criterion for determining whether data are ‘identifiable’, and thus personal, is whether it is ‘reasonably likely’ that a person can, for example, be singled out. The recital continues:
To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.
In the case of Breyer v. Germany,20 the CJEU decided that even a dynamic IP address may qualify as personal data, depending on whether the link with a specific person can be made. The case concerned government websites that processed dynamic IP addresses, keeping them longer than was necessary for providing access to the sites. What made this case special is that the ability to link the IP address to a specific person was not in the hands of the operators of the government websites but in the hands of internet service providers (ISPs). The CJEU found that because ISPs could be ordered by a court to provide information about the user of a dynamic IP address, this IP address should not be considered anonymous.
The material scope of the GDPR regards (as discussed above) the processing of personal data. This implies that to avoid applicability of the GDPR, one could ‘simply’ anonymize previously personal data. There are two caveats here. First, anonymization is itself a form of processing, and thereby requires a valid legal ground (see the discussion of legal grounds below). Second, anonymization is not easy, because the risk of re-identification easily turns ‘anonymous’ data into identifiable and thus personal data. In practice, anonymization will often remove so much information from the data that it is no longer relevant for the purpose of processing. To better understand the difference between personal and anonymized data, we can best check the definition of ‘pseudonymization’ in Article 4(5):
• the processing of personal data in such a manner that
• the personal data can no longer be attributed to a specific data subject
• without the use of additional information,
• provided that such additional information is kept separately and
• is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person;
First, we see that pseudonymous data is defined as a subset of personal data. Second, it is defined as data from which any identifying information has been removed and stored separately, subject to technical and organizational measures that resist re-identification.
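For computer scientists, the requirements of Article 4(5) can be made concrete with a short sketch. In the following Python fragment (field names and data are hypothetical, invented for illustration), the direct identifier is replaced by a random token, while the token-to-identity table, the ‘additional information’ in the sense of Article 4(5), is kept separately under its own safeguards:

```python
import secrets

# Illustrative sketch only: field names and data are hypothetical.
# The direct identifier is replaced by a random token; the token-to-identity
# lookup table is the 'additional information' of Article 4(5) and must be
# kept separately, under technical and organisational safeguards.

def pseudonymize(records, id_field="email"):
    lookup = {}            # token -> original identifier: store separately!
    result = []
    for record in records:
        token = secrets.token_hex(16)      # unguessable random token
        lookup[token] = record[id_field]
        cleaned = dict(record)
        cleaned[id_field] = token
        result.append(cleaned)
    return result, lookup

records = [{"email": "alice@example.com", "purchases": 3}]
data, key_table = pseudonymize(records)
assert data[0]["email"] != "alice@example.com"             # identifier replaced
assert key_table[data[0]["email"]] == "alice@example.com"  # re-attribution possible
```

Because the lookup table permits re-attribution, the output remains personal data under the GDPR; pseudonymization reduces risk but does not, by itself, take the processing outside the Regulation’s scope.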
5.5.2.3 Data controller and data processor
Article 4(7) GDPR defines ‘controller’ as:
• the natural or legal person, public authority, agency or any other body
• which alone or jointly with others
• determines the purposes and means of the processing of personal data;
The definition of ‘data controller’ is crucial, because the ‘data controller’ is both accountable and liable for compliance with all the obligations of the GDPR, including obligations to implement a proactive approach to potential risks to the fundamental rights and freedoms of data subjects. The ‘data controller’ is basically defined as whoever determines the purpose of processing, whereby the CJEU checks who determines such purpose in practice, not merely on paper. The ‘data controller’ also determines the means of processing, but this can be outsourced to a data processor, defined as (Article 4(8)):
• a natural or legal person, public authority, agency or any other body
• which processes personal data on behalf of the controller;
Here, we clearly see that the data controller remains accountable for the choice of the means of processing, even if that choice is made by a processor. When the landmark case on the so-called ‘right to be forgotten’ was decided in 2014 (Google Spain v. Costeja),21 one of the most important issues was whether Google should be qualified as a data controller or a data processor. Google had argued that its search engine has no other function than to provide its users with automatically generated search results, thereby claiming that it is the user, not the service provider, who determines the purpose of the processing. Google argued that its search engine is merely a choice of means (the PageRank algorithm) employed in the service of users who decide the purpose of the search. The Advocate General (AG), the highest adviser of the CJEU (the Court), who is required to provide a so-called ‘Opinion’ (advice) to the Court, had taken the position that in the case of a search engine, the service provider is indeed a data processor, not the controller. Surprisingly, the Court (which is not bound by the Opinion of the AG) took another position, based on the fact that Google Spain (the subsidiary that sells advertising space on the search engine’s pages directed to Spanish users) has its own business model and thereby determines the purpose of processing. If the Court had not qualified Google Spain as a data controller, it could never have required Google to de-list the news item that Costeja wished to have erased.
Another pivotal case of 2018 concerned the fanpage of Wirtschaftsakademie,22 used to provide services in the realm of education. The fanpage was hosted on Facebook, which enabled the operator to obtain anonymous statistical details on website visitors via the ‘Facebook Insights’ function, which Facebook offers free of charge under non-negotiable conditions. The CJEU decided that the operator of the fanpage was a joint controller, together with Facebook, as the statistics were obtained by processing cookies placed on the terminal equipment of the visitors. Since the purpose of the processing of such cookies is co-decided by the fanpage operator, even though it has no control over the data processing and was not given access to the data, they are jointly responsible for the necessary processing of personal data. Under the GDPR this would be based on Article 26, which reads:
Where two or more controllers jointly determine the purposes and means of processing, they shall be joint controllers. They shall in a transparent manner determine their respective responsibilities for compliance with the obligations under this Regulation, in particular as regards the exercising of the rights of the data subject and their respective duties to provide the information ( … ).
The processing of personal data is only allowed on the basis of one of six legal grounds. Note that consent is just one of those legal grounds and not necessarily the most obvious. Article 6.1 GDPR reads:

Processing shall be lawful only if and to the extent that at least one of the following applies:
a) the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
Under the DPD ‘for one or more specific purposes’ was not explicitly mentioned, though it was obvious from requirements detailed elsewhere in the DPD. Under the GDPR, it is explicitly clear that consent is only valid if the purpose has been specified. As Article 5 stipulates that data may only be processed if necessary for the specified purpose, this means that consent can only concern the processing of personal data that is necessary for the purpose that was communicated. All the other grounds stipulate that the processing must be necessary in relation to the ground.
Valid consent will be further discussed in a dedicated section below.
b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
This entails that once the contract has been concluded and performed and the data is no longer necessary (goods or service delivered, invoice paid), it may no longer be processed on this ground. Further processing will require another ground, for example, consent (for another purpose).
c) processing is necessary for compliance with a legal obligation to which the controller is subject;
Much processing is mandatory due to legal obligations, such as processing by the tax authority, social security agency, land registry, or by commercial enterprises that must, for example, comply with employment, social security, and tax legislation. Article 6.3 stipulates that this processing must be based on MS or Union law, must contain the specific purpose(s) of processing, and must have relevant limitations and safeguards.
d) processing is necessary in order to protect the vital interests of the data subject or of another natural person;
e) processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
This ground is comparable to the c-ground, but here there may not be a legal obligation but a legal competence or task that requires the processing of personal data. We can think of processing by various types of government agencies that provide support to those in need, or need to collect information on energy usage to develop policies on the reduction of energy consumption. Note that to the extent that such information can rely on aggregated or otherwise anonymized data, the processing of personal data is not necessary and cannot be based on this ground. Article 6.3 stipulates that this processing must be based on MS or Union law, must contain the specific purpose(s) of processing, and must have relevant limitations and safeguards.
f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.
Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks.
The f-ground is important for processing carried out by the commercial sector, including financial institutions, social networks, and search engines, and we may expect that added value service providers in the context of smart homes, smart grids, and connected cars will base the processing of data that is not necessary for the primary process (which will often be based on contract) on the f-ground. As the economic interests of a business, including its competitive edge and innovative potential, often depend on advertising revenue and/or the sale of personal data or inferred profiles, the f-ground is a tempting basis insofar as other grounds do not apply.
The balancing test required of the controller entails the following considerations:24
the nature and source of the legitimate interest and whether the data processing is necessary for the exercise of a fundamental right, is otherwise in the public interest, or benefits from recognition in the community concerned;
the impact on the data subject and their reasonable expectations about what will happen to their data, as well as the nature of the data and how they are processed;
additional safeguards which could limit undue impact on the data subject, such as data minimisation, privacy-enhancing technologies, increased transparency, a general and unconditional right to opt-out, and data portability.
In Google Spain, the Court first looked into the economic interest of Google Spain in sustaining its business model, because the right to erasure and the right to object that Costeja invoked would involve costs on the side of Google (especially because many others may similarly submit requests to de-reference). The CJEU found that:
81 In the light of the potential seriousness of that interference, it is clear that it cannot be justified by merely the economic interest which the operator of such an engine has in that processing. ( … )
The seriousness of the interference, in this case, was argued in considerations 37 and 38:
37 ( … ) the organisation and aggregation of information published on the internet that are effected by search engines with the aim of facilitating their users’ access to that information may, when users carry out their search on the basis of an individual’s name, result in them obtaining through the list of results a structured overview of the information relating to that individual that can be found on the internet enabling them to establish a more or less detailed profile of the data subject.
38 Inasmuch as the activity of a search engine is therefore liable to affect significantly, and additionally compared with that of the publishers of websites, the fundamental rights to privacy and to the protection of personal data, ( … ).
Second, the Court considered the legitimate interests of users of the search engine in having access to the search result that may be de-referenced:
81 ( … ) However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, in situations such as that at issue in the main proceedings a fair balance should be sought in particular between that interest and the data subject’s fundamental rights under Articles 7 and 8 of the Charter. Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.
Here we see a clash between the freedom of information of search engine users and the right to data protection of the data subject, which requires some subtle balancing. Note, however, that the Court is not discussing the removal of content from the internet, but the de-referencing of a search result that links to such content.
5.5.2.4 Principles of lawful, fair, and transparent processing
Next to, and thus on top of, having a legal ground, Article 5 GDPR stipulates a set of rules under the heading of ‘Principles relating to the processing of personal data’. Though the use of the term ‘principles’ could suggest that these are just some underlying assumptions, they are in fact rules that must be complied with. We will follow the wording of the article, discussing each paragraph along the way (the principles in bold are part of the article, emphasis is mine):
(a) processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness and transparency’);
Though one may think that lawfulness merely refers to Article 6, which contains the legal basis, the term ‘lawfulness’ also refers to the bigger picture of the rule of law, as with the requirement that infringements of the right to privacy under Article 8 ECHR must be ‘in accordance with the law’. This means that a mere basis in law is not enough and must be understood in qualitative terms to include respect for legitimate expectations, independent oversight, and other checks and balances to ensure that the legal basis of Article 6 is valid (see also Article 6.3). Similarly, fairness refers to various balancing and proportionality tests, taking note of the relevant interests and fundamental rights that are at stake. Transparency is further detailed in Articles 13, 14, and 15 GDPR.
(b) collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes (‘purpose limitation’);
Further processing for another purpose is allowed if the purpose is not incompatible with the initial purpose, as communicated to the data subject. To determine whether the new purpose is compatible, Article 6(4) provides the following indications: any link between the old and the new purpose, the context of collection and the relationship between controller and subject, the nature and sensitivity of the data, the potential consequences of further processing for the data subject, and the existence of appropriate safeguards, such as encryption or pseudonymization. In case of consent for the new purpose or a legal obligation that involves the new purpose, processing is based on the new ground and cannot be based on processing for a compatible purpose.
Secondary usage (further processing) for scientific or statistical research or archiving in the public interest is considered compatible by default. The GDPR contains an extensive exception for such processing in Article 89, with further exceptions for medical research in, for example, Article 9.2(h). Recital 33 furthermore indicates that ‘[i]t is often not possible to fully identify the purpose of personal data processing for scientific research purposes at the time of data collection. Therefore, data subjects should be allowed to give their consent to certain areas of scientific research when in keeping with recognised ethical standards for scientific research. Data subjects should have the opportunity to give their consent only to certain areas of research or parts of research projects to the extent allowed by the intended purpose’.
(c) adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’);
Data minimization is another core principle, which also underlies the principles of purpose limitation and storage limitation. In the DPD, this principle was articulated as ‘adequate, relevant and not excessive’, whereas now the criterion is ‘adequate, relevant and limited to what is necessary’. This is a further restriction, moving towards strict proportionality and subsidiarity, thereby also relating to the requirement to pseudonymize or anonymize the data as soon as possible. This principle links consent to necessity, as observed above. It also connects with the right to request erasure if processing is irrelevant for the given purpose.
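A minimal sketch of how data minimization might be operationalized in code: per declared purpose, only the fields necessary for that purpose survive. The purposes and field names below are illustrative assumptions, not derived from the GDPR itself:

```python
# Illustrative sketch only: purposes and field names are invented.
# Data minimisation as a whitelist: per specified purpose, only the fields
# necessary for that purpose are retained.

NECESSARY_FIELDS = {
    "order_fulfilment": {"name", "address", "items"},
    "age_verification": {"date_of_birth"},
}

def minimise(record, purpose):
    allowed = NECESSARY_FIELDS[purpose]
    return {field: value for field, value in record.items() if field in allowed}

record = {
    "name": "Alice",
    "address": "Main St 1",
    "items": ["book"],
    "date_of_birth": "1990-01-01",   # not necessary for fulfilling an order
}
assert set(minimise(record, "order_fulfilment")) == {"name", "address", "items"}
```

The design choice here is deliberate: a whitelist of necessary fields fails closed, whereas a blacklist of forbidden fields would silently admit any new attribute added later.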
(d) accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’);
Here the principle of accuracy is formulated as a legal obligation of the data controller, but this connects with the rights of erasure and rectification in the case that data are inaccurate.
(e) kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject (‘storage limitation’);
Storage limitation basically requires that controllers engage in lifecycle management of the personal data they process, removing them, for example, when the purpose is exhausted and processing is no longer relevant. The exception for scientific research and archiving, mentioned above, requires appropriate technical and organizational safeguards, taking into account the rights and freedoms of the data subject, which will vary depending on, for example, the nature of the data.
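Lifecycle management of this kind can be sketched as a retention policy keyed to purposes; the retention periods below are purely illustrative, as the lawful period follows from the purpose and applicable law:

```python
from datetime import datetime, timedelta

# Illustrative retention periods per purpose; the lawful period follows
# from the purpose and applicable law, not from this table.
RETENTION = {
    "order_fulfilment": timedelta(days=365),
    "support_ticket": timedelta(days=90),
}

def expired(collected_at, purpose, now):
    """True if the storage period for this purpose has lapsed (Article 5(1)(e))."""
    return now - collected_at > RETENTION[purpose]

def purge(records, now):
    """Retain only records whose purpose still justifies storage; in a real
    system the discarded records would be erased or anonymized."""
    return [r for r in records if not expired(r["collected_at"], r["purpose"], now)]
```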
(f) processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’).
This principle connects with the requirement of security by design of Article 32, and the legal obligation for controllers to notify supervisory authorities and data subjects of data breaches (Articles 33, 34).
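One common technical measure in this context is pseudonymization via a keyed hash, replacing direct identifiers while the controller retains the ability to link records. A minimal sketch; whether such a measure is ‘appropriate’ under Article 32 depends on the actual risks, and the key must be stored separately from the pseudonymized data:

```python
import hashlib
import hmac

def pseudonymise(identifier, key):
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    If the key is stored alongside the data, the measure adds little."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()
```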
2. The controller shall be responsible for, and be able to demonstrate compliance with, paragraph 1 (‘accountability’).
The accountability principle addresses the data controller as the focal point of responsibility, accountability, and liability regarding compliance with the principles that pervade the GDPR. Accountability is further detailed in Article 30, which requires the controller to demonstrate and document compliance, while liability is further detailed in Articles 79–83 on enforcement (including both administrative law fines and prohibitions, and private law compensation and injunctive relief). The roles and responsibilities of the controller (including joint controllers) and processor are further specified in Articles 24, 26, and 28.
Valid consent
Unlike the DPD, the GDPR contains a separate article on consent. Article 7 declares, under the heading of ‘Conditions for Consent’:
1. Where processing is based on consent, the controller shall be able to demonstrate that the data subject has consented to processing of his or her personal data.
This concerns the burden of proof, which rests on the controller.
2. If the data subject’s consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding.
Note that consent may not be hidden in complicated, wordy privacy policies, and must be ‘easily accessible’ as to its form (think of the user interface), ‘using clear and plain language’. If consent is part of elaborate and incomprehensible Terms of Service that in effect assume implicit consent, such consent is not valid.
3. The data subject shall have the right to withdraw his or her consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. Prior to giving consent, the data subject shall be informed thereof. It shall be as easy to withdraw as to give consent.
This means that if consent is given by ticking a box, it must be as easy to untick the box. If one has to explore every nook and cranny of a website to figure out how to withdraw consent, the consent is not valid.
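Both the burden of proof of Article 7(1) and the symmetry requirement of Article 7(3) suggest an append-only record of consent events, in which withdrawing consent is as simple an operation as giving it. A minimal in-memory sketch; a real system would need durable, tamper-evident storage:

```python
from datetime import datetime

class ConsentLedger:
    """Append-only record of consent events: the controller can demonstrate
    consent (Article 7(1)), and withdrawal is a single, symmetric operation
    (Article 7(3))."""

    def __init__(self):
        self._events = []  # (subject_id, purpose, action, timestamp)

    def give(self, subject_id, purpose, when):
        self._events.append((subject_id, purpose, "given", when))

    def withdraw(self, subject_id, purpose, when):
        # Withdrawing is exactly one call, just like giving consent.
        self._events.append((subject_id, purpose, "withdrawn", when))

    def is_valid(self, subject_id, purpose, at):
        """Consent status at a point in time; processing that took place
        before withdrawal remains lawful."""
        state = False
        for sid, pur, action, ts in sorted(self._events, key=lambda e: e[3]):
            if sid == subject_id and pur == purpose and ts <= at:
                state = action == "given"
        return state
```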
4. When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.
To better understand what this means, we can use recital 43:
In order to ensure that consent is freely given, consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation. Consent is presumed not to be freely given if it does not allow separate consent to be given to different personal data processing operations despite it being appropriate in the individual case, or if the performance of a contract, including the provision of a service, is dependent on the consent despite such consent not being necessary for such performance.
Note that the legal ground must be communicated to the data subject when the processing commences (if data is collected from the data subjects, cf. Article 13), or within a reasonable time, at the latest within one month after obtaining the data (if data has not been obtained from the data subjects, cf. Article 14). Controllers cannot require consent and—after finding that the consent is not valid—claim that the processing is based on its legitimate interest; due to the inherent logic of the different grounds, controllers cannot claim to base the same processing operations on different grounds.
Special categories of data
Article 9 defines a set of data as requiring special treatment. These data are often called ‘sensitive data’ and are defined as: ‘data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation’.
By default, the processing of such data is prohibited. Strictly defined exceptions apply, notably: explicit consent; specific rights and obligations in the field of employment and social security and social protection law; the vital interests of the data subject or of another natural person; processing in the context of not-for-profit bodies with a political, philosophical, religious, or trade union aim; processing of personal data which are manifestly made public by the data subject; processing necessary for legal claims; substantial public interest; preventive or occupational medicine, assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment or the management of health or social care systems and services; public health; and archiving purposes in the public interest, scientific or historical research purposes or statistical purposes.
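A backend can enforce this default prohibition by treating special-category fields as blocked unless a named exception applies. The sketch below is illustrative only: tagging concrete fields as special-category data, and deciding whether an Article 9(2) exception actually applies, are legal judgements, and the exception labels used here are our own shorthand:

```python
# Article 9(1) categories, paraphrased as field tags.
SPECIAL_CATEGORIES = {
    "racial_or_ethnic_origin", "political_opinions", "religious_beliefs",
    "trade_union_membership", "genetic_data", "biometric_data",
    "health_data", "sex_life_or_orientation",
}

# Our own shorthand labels for some Article 9(2) exceptions.
RECOGNISED_EXCEPTIONS = {
    "explicit_consent", "employment_law", "vital_interests",
    "legal_claims", "substantial_public_interest", "public_health",
    "research_or_archiving",
}

def may_process(field_tags, exception=None):
    """Default-deny: special-category data may only be processed when a
    recognised exception applies; other data passes this check."""
    if field_tags & SPECIAL_CATEGORIES:
        return exception in RECOGNISED_EXCEPTIONS
    return True
```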
On top of that, Article 10 restricts the ‘Processing of personal data relating to criminal convictions and offences’.
Processing of personal data relating to criminal convictions and offences or related security measures based on Article 6(1) shall be carried out only under the control of official authority or when the processing is authorised by Union or Member State law providing for appropriate safeguards for the rights and freedoms of data subjects. Any comprehensive register of criminal convictions shall be kept only under the control of official authority.
This is particularly relevant when machine learning or other techniques are used to infer patterns from big data, because such inferences may include sensitive data. Social networks, advertising intermediaries, or criminal justice authorities may infer racial or ethnic origin, political opinion, or sexual preferences, and these inferences may then be applied to identifiable persons that match the profile. Such inferencing may be inadvertent, but may nevertheless result in decisions based on such inferences, for instance parole decisions based on a correlation between race and recidivism. In Chapters 10 and 11 we will return to this point when discussing machine learning and profiling, including an analysis of GDPR provisions on profiling and automated decision-making based on profiling.
Data protection by design and default (DPbDD)
In Chapter 1, notably in section 1.4, we have identified the text-driven nature of modern law, in contrast with the orality of prior normative orderings. The rise of data- and code-driven ICIs confronts the text-driven nature of the law with a number of problems. Merely writing down and enacting legal norms may not work if the defaults of the technical and organizational architecture of the onlife world generate a contradictory normativity, which renders compliance with legal norms difficult if not impossible. In other words, the technical architecture may present its users and inhabitants with a choice architecture that limits their understanding of the backend systems of the social networks they use, of their smart homes, connected cars, and more.
Article 25 GDPR requires that data controllers design the data processing operations in compliance with data protection law. Data protection by design (DPbD) may sound like Privacy by Design (PbD). However, the latter is based on an ethical duty, not necessarily on a legal obligation; PbD reflects the choice of a controller to respect the privacy of their users by way of a privacy-friendly design. Also, as privacy is not equivalent to data protection, PbD cannot be equated with DPbD, even though in practice the terminology is often used interchangeably.
DPbD is a new legal obligation (no such obligation applied under the DPD). In case of non-compliance, the legal effects are liability for damages (private law liability, Article 82), administrative fines for unlawful processing (Article 83), or injunctive relief (a private law injunction to stop unlawful processing, with penalty payments for every day of non-compliance, Article 79).
Under the heading of ‘data protection by design and default’, Article 25 stipulates:
1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
This should mitigate potential risks for the rights and freedoms of natural persons. The latter demonstrates the risk-based approach of the GDPR, which requires that controllers take a proactive approach when developing their computational backend systems. Note that Article 25 does not speak of the risks for rights and freedoms of data subjects, but of natural persons. This includes processing operations that impact other individuals, for instance when inferencing behavioural correlations enables the influencing, exclusion, or other types of targeting of others than the data subject. Relevant design measures are, for instance, pseudonymization, but one can also think of user-friendly interfaces that enable easy withdrawal of consent (Article 7.3) or subject access requests (SARs) (based on Article 15.3). Both the withdrawal of consent and SARs will involve computational architectures in the backend systems that effectively halt the processing of data for which consent has been withdrawn, or provide the data that are being processed (Article 15.3 stipulates that if a SAR is made via electronic means, the data shall be provided in a commonly used electronic format).
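For instance, answering an electronic SAR with the data ‘in a commonly used electronic format’ could be as simple as a structured export. JSON is one obvious candidate, though the GDPR does not name a format; the choice below is ours:

```python
import json

def sar_export(subject_data):
    """Produce a copy of a data subject's personal data in a commonly used
    electronic format (Article 15(3)); the use of JSON is our own choice."""
    return json.dumps(subject_data, indent=2, sort_keys=True, ensure_ascii=False)
```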
2. The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.
Paragraph 2 demands that the architecture is constructed in such a way that no additional processing takes place beyond what is necessary for the specific purpose of the relevant processing operations. Again, compliance with this legal obligation will result in major reconfigurations of current backend systems, involving, for example, effective lifecycle management of personal data (including pseudonymization, anonymization, and deletion of data).
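Data protection by default can be approximated in code by shipping privacy-protective defaults that are only changed by the individual’s own intervention. A minimal sketch with hypothetical setting names:

```python
# Hypothetical optional-processing settings: everything off by default,
# switched on only by the individual's own intervention (Article 25(2)).
DEFAULT_SETTINGS = {
    "profile_public": False,
    "analytics": False,
    "personalised_ads": False,
}

def new_account_settings(opt_ins=None):
    """Start from privacy-protective defaults and apply explicit opt-ins only."""
    settings = dict(DEFAULT_SETTINGS)
    settings.update(opt_ins or {})
    return settings
```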
The third paragraph declares that an approved certification mechanism may contribute to demonstration of compliance with DPbDD.
Data protection impact assessment
DPbDD is closely related to another new compliance mechanism, the data protection impact assessment (DPIA), again exhibiting the risk-based, proactive approach that is favoured under the GDPR. Basically, controllers are obligated to assess potential violations of the GDPR when initiating new data-driven technologies. Article 35, under the heading of ‘data protection impact assessment’, reads:
1. Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.
The criterion that decides whether a controller must conduct a DPIA is that foreseen processing operations are ‘likely to result in a high risk to the rights and freedoms of natural persons’. Again, these risks are not restricted to data subjects, but extend to all natural persons. The assessment investigates the potential impact of envisaged processing operations, which assumes that these are indeed foreseen and mapped against impact on fundamental rights.
2. The controller shall seek the advice of the data protection officer, where designated, when carrying out a data protection impact assessment.
Articles 37–39 detail which types of controller must appoint a data protection officer (DPO), under what conditions (e.g. safeguards for independence) and with what tasks. One of the tasks of the DPO is to advise on the DPIA.
3. A data protection impact assessment referred to in paragraph 1 shall in particular be required in the case of:
(a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;
(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10; or
(c) a systematic monitoring of a publicly accessible area on a large scale.
Paragraph 3 sums up when a DPIA is mandatory, thus also giving an indication of what types of processing operations are considered high-risk.
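The Article 35(3) triggers can be encoded as a simple check, with the caveat that the general ‘likely high risk’ test of Article 35(1) and the supervisory authorities’ lists may require a DPIA in further cases, so a negative outcome here does not mean that no DPIA is needed:

```python
def dpia_mandatory(systematic_profiling_with_effects,
                   large_scale_special_categories,
                   large_scale_public_monitoring):
    """Article 35(3) triggers only: a False result does not mean no DPIA is
    needed, since Article 35(1) and the supervisory authorities' lists may
    still require one."""
    return (systematic_profiling_with_effects      # 35(3)(a)
            or large_scale_special_categories      # 35(3)(b)
            or large_scale_public_monitoring)      # 35(3)(c)
```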
Paragraphs 4–6 stipulate that supervisory authorities shall publish a further list of the kind of processing operations where a DPIA is mandatory, and may publish a list of processing operations where a DPIA is not mandatory. Both lists will be shared with the EDPB (which has an important advisory function as to the interpretation of the GDPR, and is further defined in Articles 68–76 GDPR).
7. The assessment shall contain at least:
(a) a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
(b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
(c) an assessment of the risks to the rights and freedoms of data subjects referred to in paragraph 1; and
(d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.
Paragraph 7 provides a first indication of a template for the DPIA. The listing has a high level of abstraction, thus enabling adequate concretization, depending on the types of processing operations, the context of processing, the nature of the data, and so forth. Under (d) we recognize a reference to DPbD, whose purpose is to mitigate risks to the rights and freedoms of natural persons.
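The minimum content of paragraph 7 can serve as a skeleton for a DPIA report; the field names below are our own shorthand for the four required elements:

```python
from dataclasses import dataclass

@dataclass
class DPIAReport:
    """Skeleton mirroring the minimum content of Article 35(7); the field
    names are our own shorthand for the four required elements."""
    processing_description: str         # 35(7)(a): operations and purposes
    necessity_and_proportionality: str  # 35(7)(b)
    risks_to_rights: list               # 35(7)(c)
    mitigating_measures: list           # 35(7)(d): incl. DPbD measures

    def complete(self):
        """All four mandatory elements present and non-empty."""
        return all([self.processing_description,
                    self.necessity_and_proportionality,
                    self.risks_to_rights,
                    self.mitigating_measures])
```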
Paragraph 8 states that approved codes of conduct (Article 40 GDPR) will be taken into account when assessing the impact.
9. Where appropriate, the controller shall seek the views of data subjects or their representatives on the intended processing, without prejudice to the protection of commercial or public interests or the security of processing operations.
Paragraph 9 emphasizes the need to involve those who will bear the consequences of the intended processing, on the side of the data subjects as well as on the side of the controller. In earlier versions of the GDPR, the need to involve data subjects in the assessment was articulated more forcefully. One can imagine that a robust architecture will fare well when built on input from those who will actually be affected.
Paragraph 10 provides an exception for processing based on a legal obligation or a public task or authority (Article 6.1 under (c) and (e)), whenever the enactment of such an obligation has been preceded by a general DPIA on account of the legislator.
11. Where necessary, the controller shall carry out a review to assess if processing is performed in accordance with the data protection impact assessment at least when there is a change of the risk represented by processing operations.
Compliance and enforcement
The GDPR reinforces the accountability principle by initiating new legal obligations to further compliance, notably the obligation to implement DPbDD and to conduct a DPIA. Apart from those, other legal obligations require technical and organizational compliance measures, such as easy withdrawal of consent (Article 7.3), provision of access by way of an electronic file (Article 15.3), obligations to employ pseudonymization (Articles 6.4(e), 25.1, 32.1(a), 40.2(d), 89), data portability rights (Article 20), security by design (Article 32), and more generally technical measures (e.g. Article 17.2). At the same time, the GDPR requires that the controller keeps a proper administration to demonstrate compliance (Article 30), departing from the old regime (under the DPD) where controllers had to register their operations with the data protection supervisor.
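The Article 30 record of processing activities lends itself to a structured register that can be produced on request; the selection of fields below loosely follows Article 30(1) and is not exhaustive:

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    """One entry in the record of processing activities; the fields loosely
    follow Article 30(1) and are not exhaustive."""
    activity: str
    purposes: list
    data_categories: list
    recipient_categories: list
    retention: str
    security_measures: str

class ProcessingRegister:
    """In-memory register the controller can make available to the
    supervisory authority on request (Article 30(4))."""

    def __init__(self):
        self.records = []

    def add(self, record):
        self.records.append(record)
```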
Next to these novel obligations, the regulation takes enforcement seriously. One of the biggest failures of previous regimes of data protection law was a pervasive lack of enforcement, which provided hardly any incentive to comply. The enforcement chapter of the GDPR, however, provides for a close-knit network of enforcement activities, by individual persons, by non-profit organizations, and by the supervisors.
Chapter VIII provides the following enforcement mechanisms, under the heading of ‘Remedies, liability and penalties’:
5.6 Privacy and Data Protection Revisited
In this chapter, we have explored human rights law and investigated the ‘workings’ of the right to privacy in the context of the ECHR, and the right to data protection in the context of the CFREU, as further protected by the GDPR. This cannot be more than a first impression of relevant applicable law. Many relevant provisions and other legislation have not been discussed: the PDPD has not been addressed, and the ePD (and its draft successor) has not been detailed. Convention 108 of the Council of Europe has been ignored,27 and a further exploration of the differences between EU and US law has been similarly left aside.
In Chapter 10 we will return to the subject of EU data protection law with an eye to the increasingly code-driven nature of our environment, highlighting the unique nature of EU data protection rights with regard to automated decisions based on the processing of personal data.
On the right to privacy
Council of Europe, Guide on Article 8 of the European Convention on Human Rights. Right to respect for private and family life, home and correspondence, updated on 31 August 2018. https://www.echr.coe.int/Documents/Guide_Art_8_ENG.pdf.
Korff, Douwe, 2008. The Standard Approach under Articles 8–11 ECHR and Art. 2 ECHR. https://www.pravo.unizg.hr/_download/repository/KORFF_-_STANDARD_APPROACH_ARTS_8-11_ART2.pdf.
Mowbray, Alastair. 2005. ‘The Creativity of the European Court of Human Rights’. Human Rights Law Review 5 (1): 57–79. https://doi.org/10.1093/hrlrev/ngi003.
On the concept of family resemblance
Biletzki, Anat and Anat Matar. ‘Ludwig Wittgenstein’. The Stanford Encyclopedia of Philosophy (Summer 2018 Edition), Edward N. Zalta (ed.). https://plato.stanford.edu/archives/sum2018/entries/wittgenstein/ (the quote in this chapter is taken from this entry).
On freedom from and freedom to
Berlin, Isaiah. 1969. ‘Two Concepts of Liberty’. In Four Essays on Liberty, edited by Isaiah Berlin, 118–73. Oxford and New York: Oxford University Press.
On data protection law
European Union Agency for Fundamental Rights (FRA). 2018. ‘Handbook on European Data Protection Law—2018 Edition’. http://fra.europa.eu/en/publication/2018/handbook-european-data-protection-law.
Journal: International Data Privacy Law. https://global.oup.com/academic/product/international-data-privacy-law-20444001.
Kuner, Christopher. 2007. European Data Protection Law: Corporate Regulation and Compliance. 2nd ed. New York: Oxford University Press (see updates per chapter at: http://global.oup.com/booksites/content/9780199283859/updates/).
Westin, Alan. 1967. Privacy and Freedom. New York: Atheneum. (the quotation is taken from p. 7).
On privacy as freedom from unreasonable constraints on identity construction
Agre, Philip E., and Marc Rotenberg. 2001. Technology and Privacy: The New Landscape. Cambridge, MA: MIT (quotation taken from p. 7).
On the difference between privacy and data protection
Kokott, Juliane, and Christoph Sobotta. 2013. ‘The Distinction between Privacy and Data Protection in the Jurisprudence of the CJEU and the ECtHR’. International Data Privacy Law 4 (3), 222–28. http://idpl.oxfordjournals.org/content/3/4/222.full?sid=a0d12330-d8f3-4387-a7dc-58905c9379a2.
On data protection by design and on legal protection by design
Hildebrandt, Mireille. 2017. ‘Saved by Design? The Case of Legal Protection by Design’. NanoEthics, August, 1–5. https://doi.org/10.1007/s11569-017-0299-0.
Hildebrandt, Mireille, and Laura Tielemans. 2013. ‘Data Protection by Design and Technology Neutral Law’. Computer Law & Security Review 29 (5): 509–521.
(1) This refers to para. 66 of Wittgenstein’s Philosophical Investigations. See the correct reference to the Stanford Encyclopedia entry under references.
(2) The Tractatus is Wittgenstein’s seminal work, preceding his Philosophical Investigations. In the latter, he rejects propositional logic and definitions in terms of necessary and sufficient conditions, though he endorsed them in the former. From the perspective of the latter, the viewpoint taken in the former is just one ‘language game’ amongst many others, which means that the former should not claim a monopoly on understanding meaning.
(3) Katz v. United States, 389 U.S. 347, 360 (1967), confirmed in, e.g. California v. Greenwood, 486 U.S. 35, 41 (1988).
(4) First relevant case was Griswold v. Connecticut, 381 U.S. 479 (1965).
(5) ECtHR, 16 December 1992, Application no. 13710/88 (Niemietz v. Germany), regarding the search of a law firm; ECtHR, 25 June 1997, Application no. 20605/92 (Halford v. UK), regarding the interception of telephone calls at work.
(6) Airey v. Ireland, 9 October 1979, Series A, no. 32, para. 24.
(7) ECtHR, 2 August 1984, Application no. 8691/79 (Malone v. UK).
(8) ECtHR, 24 April 1990, Application no. 11801/85 (Huvig & Kruslin v. France).
(9) ECtHR, 6 September 1978, Application no. 5029/71 (Klass v. Germany).
(10) ECtHR, 29 June 2006, Application no. 54934/00 (Weber & Saravia v. Germany).
(11) E.g. ECtHR, 1 July 2008, Application no. 58243/00 (Liberty and Others v. the United Kingdom); ECtHR, 18 May 2010, Application no. 26839/05 (Kennedy v. the United Kingdom); ECtHR, 4 December 2015, Application no. 47143/06 (Roman Zakharov v. Russia); ECtHR, 12 January 2016, Application no. 37138/14 (Szabó and Vissy v. Hungary); ECtHR, 19 June 2018, Application no. 35252/08 (Centrum För Rättvisa v. Sweden); ECtHR, 13 September 2018, Application nos. 58170/13, 62322/14 and 24960/15 (Big Brother Watch and Others v. the United Kingdom).
(12) ECtHR, 13 September 2018, Application nos. 58170/13, 62322/14 and 24960/15 (Big Brother Watch and Others v. the United Kingdom).
(13) Privacy Act of 1974, 5 U.S.C. § 552a, see: https://www.gpo.gov/fdsys/pkg/USCODE-2012-title5/pdf/USCODE-2012-title5-partI-chap5-subchapII-sec552a.pdf.
(14) European Union, Treaty on European Union (Consolidated Version), Treaty of Maastricht, 7 February 1992, Official Journal of the European Communities C 325/5; 24 December 2002, available at: http://www.refworld.org/docid/3ae6b39218.html, for the TFEU see above footnote 8 in Chapter 4.
(15) Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.
(16) Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications).
(17) https://edpb.europa.eu, its Opinions and Guidelines can be found at: https://edpb.europa.eu/our-work-tools/general-guidance/gdpr-guidelines-recommendations-best-practices_en.
(18) Art. 2 GDPR.
(19) Directive (EU) 2016/680 on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data.
(20) CJEU, 19 October 2016, Case 582/14 (Patrick Breyer v. Germany).
(21) CJEU, 13 May 2014, C-131/12 (Google Spain v. Costeja). See also EDPB (formerly Art. 29 WP), Opinion 1/2010 on the concepts of ‘controller’ and ‘processor’.
(22) CJEU, 5 June 2018, C-210/16 (Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH).
(23) See also CJEU, 29 July 2019, Case C-40/17 (Fashion ID), where the Court ruled that ‘The operator of a website, such as Fashion ID GmbH & Co. KG, that embeds on that website a social plugin causing the browser of a visitor to that website to request content from the provider of that plugin and, to that end, to transmit to that provider personal data of the visitor can be considered to be a controller, within the meaning of Article 2(d) of Directive 95/46. That liability is, however, limited to the operation or set of operations involving the processing of personal data in respect of which it actually determines the purposes and means, that is to say, the collection and disclosure by transmission of the data at issue.’
(24) EDPB (formerly Art. 29 WP), Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, WP217, at 3.
(25) EDPB (formerly Art. 29 WP), 10 April 2018, Guidelines on consent under Regulation 2016/679, WP259.rev.1, 23.
(26) EDPB (formerly Art. 29 Working Party), 10 April 2018, Guidelines on transparency under Regulation 2016/679, WP260.rev.1, 33, 35.
(27) The 1981 Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (CETS No. 108), https://www.coe.int/en/web/data-protection/convention108-and-protocol.