By Jorge Tisné Niemann*

(1) Freely given consent
(2) Specific consent
(3) Informed consent
(4) Unambiguous consent
(1) Mass digitisation as a new context of imbalance of power
(2) Additional safeguards provided by the GDPR to protect users’ rights and freedoms


Companies and public bodies have realised that accessing and controlling the information of individuals is essential for today’s economic and political systems. This new trend of collecting information has been called surveillance capitalism. [1] Even though personal data is increasingly becoming a precious asset for trade and security, surveillance capitalism comes with its own problems, especially regarding how individuals can exercise their fundamental right to data protection in the face of new technologies. [2]-[3]

Users have traditionally relied on individual consent to maintain control over their collected data. [4] This is because, prior to the introduction of computerised systems, processing was done by written means, which meant that the risks of infringement were limited to a narrower scale. [5] Yet the digital era has made it difficult for users to understand how their information is being collected, used, combined and shared. It has also become difficult to understand how to exercise individual rights, particularly because most of the time it is uncertain where the information is, who the controllers are and for what purposes it has been used.

Accordingly, the purpose of this blog is to critically discuss the role of consent in the context of mass digitisation and how the General Data Protection Regulation (“GDPR”) compensates for the restrictions on consent. For this purpose, the second section will describe the notion of consent as a general legal basis for processing under the GDPR. The third section will examine the specific situations in which consent is still a fundamental legal basis. The fourth section will analyse the imbalance of power present in the digital environment, along with an overview of the additional safeguards provided by the GDPR to guarantee the rights and freedoms of individuals.

The blog concludes that even though consent remains a necessary legal ground for processing under certain circumstances, mass digitisation has inevitably limited its effectiveness in guaranteeing data subjects’ control. Nevertheless, the GDPR provides sufficient safeguards to guarantee the rights of data subjects in the digital environment.


Consent has been understood as the best way to secure individuals’ control over their personal information. Consent stimulates the subjects’ trust and is “the most global standard of legitimacy”. [6] This has also been called the principle of informational self-determination. [7]

Since Article 8(2) of the Charter of Fundamental Rights of the European Union states that consent is a key element of the protection of personal data [8], the Article 29 Data Protection Working Party has undertaken efforts to explain the meaning and scope of the notion under the data protection framework. [9] It should be noted that consent as a legal basis for processing is not new under the GDPR. Article 7 of Directive 95/46/EC of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (“Directive 95/46/EC”) mentioned the data subject’s unambiguous consent as the first listed legal criterion.

Today, in order to demonstrate the validity of consent, it is mandatory to comply with the different elements and conditions provided by the GDPR. The elements follow from the definition of consent [10], while the conditions are listed in Article 7 GDPR. Elements and conditions will be described together hereunder as they are closely related.

(1) Freely given consent

The data subject must have a real choice to accept or reject the terms of the consent. In other words, they cannot be obliged or compelled to agree. Nor should they be threatened with, or reasonably expect to suffer, detriment as a result of their refusal. The data subject must be free to discuss the terms and conditions. [11] Withdrawal is an essential condition for consent. [12] Its exercise must be possible at any time and by easy means (Recitals 42 and 58, and Article 7(3) GDPR).

Freely given consent is essential not just for the data subject, but also for the controller, who has the obligation to demonstrate that the consent complies with GDPR standards (Article 7(1) GDPR). [13] Regarding this obligation, and as seen by the Court of Justice of the European Union (“CJEU”), in the online environment it is not always clear who the controller is when different entities are involved in the processing of personal data. [14]

(2) Specific consent

The purposes of consent must be clear and precise (Article 6(1)(a) GDPR). This is directly connected with the purpose limitation principle under Article 5(1)(b). [15] Data subjects must be provided with enough information to understand the purposes of processing, allowing them to maintain control over their collected information. Consent may be provided for different purposes, but all of them must be clearly stated. Ambiguous purposes or vague descriptions of the processing or the controllers are unlawful. In the case of multiple purposes, granular consent should be provided. If the collected data is required for purposes other than those previously authorised, new consent should be obtained. [16]

(3) Informed consent

For consent to be freely given and specific, the data subject must have enough information to understand the content of what they are agreeing to. The GDPR does not define how the information needs to be provided, so it can be done by written, oral, audio or video means. [17] However, Article 7(2) GDPR mandates that in a written declaration concerning other matters, the consent request must be presented in a clearly distinguishable way, in an intelligible and easily accessible form, and using plain and clear language.

Clear and precise language should be used, taking into account the average person and the audience to which the information is addressed. The minimum information to be provided includes the controller’s identity, the purposes of processing, the type of data collected, the possibility to withdraw consent, any involvement of automated decision-making, and any possible risks that data will be transferred to countries or international organisations which do not meet adequacy standards or appropriate safeguards. [18]

(4) Unambiguous consent

Consent must be given by a clear affirmative action. The affirmative action should eliminate any ambiguity and demonstrate that the data subject made a conscious decision and took a positive step (as opposed to acceptance by omission). The act of acceptance must be distinguishable from other actions, so that consent can be clearly identified and proven. Such acceptance should be specific and can be given by any means that complies with the GDPR’s standards. [19]


Bearing in mind the importance and the specific regulation of consent under the GDPR, it is worth noting that the mass digitisation of the economy has challenged the notion of consent as a safe basis for processing and for guaranteeing users’ control over their information. In the words of the European Commission:

[…] in the online environment – given the opacity of privacy policies – it is often more difficult for individuals to be aware of their rights and give informed consent. This is even more complicated by the fact that, in some cases, it is not even clear what would constitute freely given, specific and informed consent to data processing, such as in the case of behavioural advertising, where internet browser settings are considered by some, but not by others, to deliver the user’s consent. [20]

Despite the challenges the online environment poses for data privacy, limiting or even discarding consent as a legal ground in some specific cases would significantly threaten individuals’ rights and freedoms, and with them, the control individuals may exercise over the collected information. Hence, under the GDPR, consent remains an essential legal basis in cases involving children’s consent, sensitive data, automated individual decision-making including profiling, and transfers to a third country or an international organisation. These cases will be described hereunder.

Regarding the first situation, Article 8 GDPR states that consent for information society services offered to a child under the age of 16 will only be lawful if the controller obtains, and makes reasonable efforts to verify, the authorisation of the holder of parental responsibility over the child. From the age of 16, the child may give their own consent, but the controller must make sure that clear and plain language is used so that children can easily understand it (Recital 58). Member states may provide lower ages for this consent requirement, but not below 13 years. [21]

The higher level of protection required for children is further explained in Recital 38. [22] This recital also mentions an exception to consent when “the consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child”.

The second situation refers to sensitive data. Article 9 GDPR states, as a general rule, that the processing of sensitive personal data is prohibited. [23] However, different legal grounds are provided as exceptions to this prohibition, consent being the only one that truly relies on the decision of the data subject. Similar provisions regarding the necessity of consent before processing sensitive data were included in Directive 95/46/EC. Furthermore, the CJEU has expressly stated that publishing another person’s health information without prior consent is an infringement of data protection law. [24]

A third situation arises where automated individual decision-making and profiling are carried out. [25] Article 22 GDPR states that people have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them, except when it is based on their explicit consent or in certain other limited circumstances. In this case, the GDPR protects the autonomy of data subjects, as they are free to allow such processing to be carried out. Automated processing has the potential to create important risks for the subject’s rights and freedoms, as constant surveillance involving the collection and analysis of information may generate discriminatory and unfair situations. [26]

Finally, it must be noted that consent is not just an instrument to protect the rights and privacy of individuals, but also to guarantee the exercise of their freedoms. Individual liberty also allows people to waive some of the rights the GDPR intends to guarantee. This is the case for Article 49 GDPR, under which, in the absence of an adequacy decision or appropriate safeguards, including binding corporate rules, a transfer or a set of transfers of personal data to a third country or an international organisation may take place if “the data subject has explicitly consented to the proposed transfer, after having been informed of the possible risks of such transfers for the data subject due to the absence of an adequacy decision and appropriate safeguards”. This is an example of how consent is not just a tool to protect data subjects, but also to allow them to take risks once they have been duly informed.


As seen above, consent remains an important basis for processing data in certain situations in which serious risks to data subjects’ rights may be encountered. However, consent is not always an adequate basis for processing personal information, and furthermore, the digital environment has created significant constraints that challenge the validity of consent in modern times.

This section will be divided into two subsections. The first one will analyse several situations which can arise in the digital environment where an imbalance of power challenges consent as a trustworthy legal basis for processing. The second one will outline additional safeguards within the GDPR which together can secure the users’ control over their collected data, regardless of the legal basis used for processing.

(1) Mass digitisation as a new context of imbalance of power

It is well known that in some cases an imbalance of power between two parties can render consent an invalid legal basis for processing. [27]-[28] Individuals are not always able to give real and genuine consent, since they may reasonably expect some sort of detriment as a result of their refusal. Traditionally these cases concern public authorities [29] and the employment context. [30]

Yet the main constraints faced by consent can be observed in the digital environment. Even though the relation between cyberspace and consent has been analysed since Directive 95/46/EC [31], mass digitisation brought a new kind of imbalance of power. [32] Big data, the Internet of Things (“IoT”), machine learning and artificial intelligence have made it almost impossible for data subjects to control their information, exercise their rights and truly understand how their information is being used, transferred and stored. This is a growing problem, as online monitoring and tracking technologies have become part of today’s ordinary life.

Despite the hope that the enforcement of the GDPR could address the risks technology presented for data collection [33], in practice consent has proven incapable of meeting the quality standards required by the GDPR in the digital era.

One of those issues relates to privacy policies in eCommerce. People commonly ignore the content of the privacy settings to which they agree when using online platforms. Moreover, some platforms cannot even be used or accessed without the users’ prior consent to the privacy policies, making it almost mandatory to accept them, even if they are not clear or specific. Privacy policies have been criticised for their length [34], their use of technical or legal jargon and their complex structures. [35] The practice of drafting policies in a confusing way creates “consensual exhaustion” [36], which explains the lack of interest on the part of users in actually reading them. As an example of the incorrect presentation of privacy preferences, the CJEU, in accordance with Recital 32 GDPR, has held that consent is not validly given by means of pre-ticked checkboxes that the user must deselect to express their refusal. [37]

Moreover, additional constraints on consent can arise in the context of smart cities and the IoT, where the development of the internet and artificial intelligence allows objects to communicate in order to access and share information without human intervention. [38] With sensors and cameras being installed in almost all portable and non-portable devices, whether in public or private locations, personal data is constantly being collected and processed. It is also becoming easier to collect data, as devices are manufactured smaller, cheaper, wirelessly connected and with weak privacy and security restrictions. [39] Additionally, it can be difficult to obtain high-quality consent, as privacy notices are not usually included as part of the design of devices. Consequently, the interfaces of these devices (screens, keyboards and other mechanisms) do not allow users to be informed and to actively express their preferences on the policies. [40]-[41]

Interest has also focused on privacy in the context of self-driving cars. Such vehicles will be connected through the internet, sharing information about the environment and the users. Hence, personal data of the driver, passengers and even third parties surrounding the vehicles will be collected and processed. The privacy issues have to do with defining when and how the different data subjects will give their consent prior to the vehicles’ collection and exchange of information. [42]

Similar observations have been made regarding big data, which can be used to process vast quantities of data in order to predict future behaviours, create profiles of data subjects (thereby allowing personalised online experiences through, for example, personalised advertisements), prevent crime or identify political orientations. [43] This is why it has been stated that big data at the very least challenges consent as a legal basis for processing, with particular concerns around purpose limitation [44], algorithmic transparency and data minimisation. [45] The Working Party is clear that even though collective and individual benefits are expected from big data, it also requires new and innovative thinking about how to practically apply data protection principles to tackle privacy issues. [46]

Aside from the practical issues mentioned above, consent also encounters problems as an appropriate lawful basis from a political-economic perspective. Consumers will engage actively in the eCommerce economy if they trust the implemented levels of protection for their personal information. Conversely, distrust of those standards may undermine the full potential of the modern economy, with corresponding repercussions for economic growth and people’s wellbeing. [47]

Scholars have discussed the privacy issues arising from new technologies. Proposals to provide adequate safeguards for this new context have been made in relation to consent obtained at the time of processing as well as consent given prior to the collection of data. [48] Privacy by design and privacy impact assessments have also been put forward as possible solutions. [49]-[50] Others have argued that consent should be replaced outright by different means in order to protect individuals’ rights and freedoms, and in turn, make data controllers fully accountable for the processing of users’ information. [51]

Despite the diversity of approaches (from improvements to moving away from consent altogether), it is certain that consent, especially in the digital environment, is facing significant pressure, and its adequacy as a ground for guaranteeing data subjects’ control over their information has inevitably become limited. But the core purpose of data protection is securing the rights and freedoms of data subjects, independently of the legal grounds that justify processing. Therefore, even though consent is experiencing problems, the GDPR contains other elements that secure the fundamental right to data privacy.

(2) Additional safeguards provided by the GDPR to protect users’ rights and freedoms

Some elements contained in the GDPR work as safeguards to protect data subjects’ rights. It is relevant to mention how those elements interact to create a robust system, which jointly compensates for the limitations and restrictions that consent is experiencing in the digital environment.

Despite the fact that the GDPR’s elements have already been the subject of criticism due to surveillance concerns [52], the benefit of such elements relies on their general application regardless of the legal basis used for processing. Hence, even where consent cannot be used, or is limited by practical issues, the following elements help to guarantee data subjects’ interests and stimulate their trust in the legal system.

Data protection principles are key elements to protect data subjects’ rights (Article 5 GDPR). They refer to lawfulness, fairness and transparency in processing; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability. [53] These principles, in turn, are closely related to other provisions of the GDPR. For example, transparency relates to the obligation of controllers and processors to provide the data subject with the information listed by the GDPR (Articles 13 and 14). Also, controllers must comply with the obligation to provide transparent information, communication and modalities for the exercise of the rights of the data subject (Article 12). They are also obliged to communicate with the data subject in the event that a personal data breach is likely to result in a high risk to the rights and freedoms of natural persons (Article 34).

Purpose limitation is a relevant principle, since collected data must be used for specified, explicit and legitimate purposes, and not be further processed in a way incompatible with those purposes; its infringement will limit the subject’s control. [54] Purpose limitation sets out restrictions for controllers on how to use the collected data. Even where consent is not used as the legal ground, this principle requires that the data subject be informed in a clear and specific way so as to understand what kind of processing will be carried out with the information. Compliance with this principle is especially important when controllers can combine data from different sources using complex algorithms, which prevents users from clearly identifying the real purposes of the processing. [55]

Purpose limitation also connects with the principle of data minimisation, which mandates that controllers collect only data that is adequate, relevant and necessary in relation to the specific processing purpose. However, the same problems described for purpose limitation also affect compliance with this principle: if data subjects are incapable of understanding what kind of processing their data will be used for, it also becomes difficult to determine what information is adequate and necessary to carry out those purposes.

Therefore, as digitisation is challenging not only consent but also traditional principles of data privacy, accountability has become an important principle under the GDPR, since the controller must be able to demonstrate compliance with the law and shall be responsible for infringements. In general terms, to demonstrate compliance with the GDPR, both controller and processor (where applicable) must comply with the mentioned principles, and with all other material obligations provided by the GDPR. For example, two relevant duties are to implement appropriate technical and organisational measures to meet the requirements of the regulation (Article 25) and to carry out data protection impact assessments prior to processing that is likely to result in a high risk to the rights and freedoms of natural persons (Article 35). [56]

Moreover, the establishment of data subjects’ rights under the GDPR [57] and the creation of supervisory authorities with powers to impose significant administrative fines are a strong deterrent against non-compliance with the GDPR, especially when it comes to meaningful consent.

The combination of the above-mentioned elements compensates for the original imbalance of power between data subjects and controllers (especially in the digital environment). They create a responsibility for the controller and processor to demonstrate compliance with the GDPR. Hence, users can rely not just on their own efforts to maintain control over their data (the exercise of rights), but on a robust system built from different components (principles, obligations, supervisory authorities, high sanctions) that are meant to protect them from data protection breaches.


Consent has traditionally been the usual legal basis on which the processing of personal data has relied. Under the GDPR, in order to establish that the data subject has given meaningful consent, such declaration of intention must be freely given, specific, informed and unambiguous. However, the digital environment (including big data, the IoT, machine learning and artificial intelligence) has challenged the efficiency of consent as the best mechanism to ensure data subjects’ control over the way their data is collected, stored, used, combined, shared and owned.

When assessing the suitability of consent, it must be noted that consent is still a valid and necessary legal ground where processing involves children’s consent, sensitive data, automated individual decision-making including profiling, and transfers to third countries or international organisations.

Aside from these specific situations, valid consent is often challenged by the issue of imbalance of power. Such imbalance can be seen in the context of public authorities or employment, but the main constraints appear in the digital era. To tackle this scenario, the incorrect notion that consent is the only or primary ground for processing personal data must be left behind. The best approach is to rethink how data subjects’ rights might best be protected, regardless of the legal ground on which processing relies. This point of view enables us to deal with the risks that mass digitisation represents for data privacy while, at the same time, taking advantage of the immense benefits that the digital economy and the free flow of data may bring for our collective wellbeing.

Notwithstanding the role of consent in the digital era, the GDPR provides a battery of elements that compensate for the original imbalance of power between data subjects and controllers. Data protection principles, the existence of supervisory authorities with powers to impose significant administrative fines, and the recognition of several rights of the data subject may be mentioned as relevant elements in assessing the efficiency of data protection policy and discouraging infringements of the regulation.

On the basis of the foregoing, the economic and political benefits expected from eCommerce and the digital environment appear to remarkably exceed the limitations consent is experiencing (except for the cases discussed above), as the GDPR contains sufficient cumulative elements that jointly represent a robust system to protect data subjects’ rights and freedoms. Therefore, efforts in the near future should focus on the improvement and strengthening of data protection safeguards, regardless of whether consent is eventually replaced or further limited as a legal ground for processing.


*PhD in Law, Universidad de los Andes, Chile. Student of LLM in Innovation, Technology and the Law, University of Edinburgh, Scotland.

[1] S Zuboff, The age of surveillance capitalism: the fight for the future at the new frontier of power (2019) at 8 states: “surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behavior”.

[2] Data protection is a fundamental right under article 8 of the Charter of Fundamental Rights of the European Union, as well as in Article 16(1) of the Treaty on the Functioning of the European Union.

[3] E Politou, E Alepis and C Patsakis, “Forgetting personal data and revoking consent under the GDPR: Challenges and proposed solutions” (2018) 4 Journal of Cybersecurity at 3.

[4] K Uta and C Andrew, Information Technology Law (2016) at 391.

[5] J Moakes, “Data Protection in Europe – Part 1” (1986) 1 Journal of International Banking Law 77 at 82.

[6] Politou et al (n 3) at 5.

[7] Article 29 Working Party, Opinion 15/2011 on the definition of consent, WP187, adopted on 13 July 2011 at 8; S Ziegler, Internet of Things Security and Data Protection (2019) at 99. Additional definitions on the notion of informational self-determination in A Tamò-Larrieux, Designing for Privacy and its Legal Framework Data Protection by Design and Default for the Internet of Things (2018) at 30-31.

[8] Article 8(2) of the Charter of Fundamental Rights of the European Union states: “Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law”.

[9] See Article 29 Working Party (n 7) and Article 29 Working Party, Guidelines on consent under Regulation 2016/679, WP259 rev.01, version of 10 April 2018.

[10] Article 4(11) GDPR defines consent as: “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.

[11] Article 29 Working Party (n 9) at 5.

[12] The context of freedom in which consent is given is left open to interpretation, as Article 7(4) GDPR uses the words inter alia, which means that the specific circumstances in which consent was given may be reviewed to determine whether it was freely given or not.

[13] The burden of proof rests on the controller, as stated by L Feiler, “The EU General Data Protection Regulation (GDPR): a commentary” (2018) at 87.

[14] Case C-40/17, Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV [2019] CJEU.

[15] Article 5(1)(b) describes that personal data shall be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes”.

[16] Article 29 Working Party (n 9) at 11-12.

[17] Article 29 Working Party (n 9) at 14.

[18] Article 29 Working Party (n 9) at 13-14.

[19] Article 29 Working Party (n 9) at 15-17.

[20] Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions, A comprehensive approach on personal data protection in the European Union. Accessed here on 15 November 2019.

[21] Article 29 Working Party (n 9) at 23-27.

[22] Recital 38 states that “children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child”.

[23] Sensitive data includes racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation.

[24] Case C-101/01, Bodil Lindqvist v Åklagarkammaren i Jönköping [2003] CJEU.

[25] Recital 71 GDPR explains: “data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her”.

[26] Article 29 Data Protection Working Party, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, adopted on 3 October 2017, at 10, include situations such as “denying people access to employment opportunities, credit or insurance, or targeting them with excessively risky or costly financial products”.

[27] In this sense, recital 43 GDPR states: “In order to ensure that consent is freely given, consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation”.

[28] Same idea in Tamò-Larrieux (n 7) at 89; T. H. A. Wisman, “Privacy, Data Protection and E-Commerce”, in A Lodder and A Murray (eds), EU Regulation of E-Commerce (2017) at 357.

[29] When personal information is collected for public purposes, obtaining meaningful consents can be inefficient or disproportionate. This is why GDPR offers exceptions or alternatives for processing, such as the necessity to comply with a legal obligation to which the controller is subject (article 6(c)), or the necessity for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (article 6(e) GDPR). See Article 29 Working Party (n 9) at 26.

[30] Article 29 Working Party (n 9) at 6-7.

[31] H Rowe, Tolley’s Data Protection Act 1998: a practical guide (2000) at 216-218.

[32] Ziegler (n 7) at 100.

[33] Zuboff (n 1) at 481.

[34] For a response to critics of lengthy privacy policies, see M Hintze, “In defense of the long privacy statement (The state of cyberlaw: Security and privacy in the digital age)” (2017) 76 Maryland Law Review 1044.

[35] K Litman-Navarro, “We Read 150 Privacy Policies. They Were an Incomprehensible Disaster”. Accessed here on 14 January 2020. Along the same lines, Article 29 Working Party (n 9) at 14 is clear when stating: “Controllers cannot use long privacy policies that are difficult to understand or statements full of legal jargon. Consent must be clear and distinguishable from other matters and provided in an intelligible and easily accessible form. This requirement essentially means that information relevant for making informed decisions on whether or not to consent may not be hidden in general terms and conditions”.

[36] Wisman (n 28) at 358. The author even mentions that “It is common practice for lawyers to advise their clients to make their terms and conditions long and fuzzy, as opposed to concise and clear”.

[37] Case C-673/17, Planet49 GmbH v Bundesverband der Verbraucherzentralen und Verbraucherverbände — Verbraucherzentrale Bundesverband eV [2019] CJEU 65.

[38] M C Gaeta, “Data protection and self-driving cars: the consent to the processing of personal data in compliance with GDPR” (2019) 24 Communications Law 15.

[39] L Edwards, “Privacy, Security and Data Protection in Smart Cities: A Critical EU Law Perspective”, European Data Protection Law Review (2016) at 2-3.

[40] Edwards (n 39) at 17; Article 29 Working Party, Opinion 8/2014 on the Recent Developments in the Internet of Things, adopted on 16 September 2014 at 7. 

[41] For a complementary description of how consent interacts with the internet of things, see L Urquhart, H Schnadelbach and N Jager, “Adaptive Architecture: Regulating Human Building Interaction” (2019) International Review of Law, Computers and Technology at 22.

[42] Gaeta (n 38).

[43] K Krasnow and P Bruening, “Big Data Analytics: Risks and Responsibilities” (2014) 4 International Data Privacy Law at 89.

[44] Uta & Andrew (n 4) at 355.

[45] On this discussion see Edwards (n 39) at 22-25. Same idea in M Macenaite, “The ‘Riskification’ of European Data Protection Law through a two-fold Shift” (2017) 8 at 515-516.

[46] Article 29 Working Party, Statement of the WP29 on the impact of the development of big data on the protection of individuals with regard to the processing of their personal data in the EU (WP 221), adopted on 16 September 2014 at 2.

[47] European Commission, “Safeguarding Privacy in a Connected World. A European Data Protection Framework for the 21st Century” (2012) Communication COM(2012) 9. Accessed here on 14 November 2019. This idea is also stated in Article 29 Working Party (n 40) at 3.

[48] S Elahi, “Privacy and consent in the digital era” (2009) 14 Information Security Technical Report at 114.

[49] Edwards (n 39) at 27-32. The author acknowledges that the proposed improvements will not solve the problem of consent in the digital era, but they are at least efforts to obtain better quality consents.

[50] Article 29 Working Party (n 40) at 21-24.

[51] In this line, Edwards (n 39) at 2-3 states: “In conclusion however, the paper reverts to pessimism with the view that to preserve privacy in smart cities we may need to move away from the liberal notion of ‘notice and choice’ or, in European terms, ‘consent’ and informed specific control over processing, entirely, and look instead to an ‘environmental’ model of toxic processes which should be banned or restricted notwithstanding user permission or substitute grounds for processing”. In similar words, R Leenes, R Van Brakel, S Gutwirth and P De Hert, Data Protection and Privacy: The Age of Intelligent Machines (2017) at 75.

[52] J Andrew and M Baker, “The General Data Protection Regulation in the Age of Surveillance Capitalism” (2019) Journal of Business Ethics, suggests that consent should be required even to trade anonymized data.

[53] For reference about these principles see Feiler (n 13) at 73-78.

[54] Notwithstanding the purposes considered compatible by the GDPR, such as public interest, scientific or historical research purposes, or statistical purposes (article 5(b) GDPR).

[55] J Rauhofer, “Of Men and Mice: Should the EU Data Protection Authorities’ Reaction to Google’s New Privacy Policy Raise Concern for the Future of the Purpose Limitation Principle?” (2015) 1 European Data Protection Law Review at 5-15.

[56] Other obligations are to provide the necessary information to the data subject (articles 13-14), guarantee and comply with the exercise of data subjects’ rights and the restrictions to processing (article 23), maintain records of processing activities (article 30), cooperate with the supervisory authority (article 31), take all security measures such as anonymisation, pseudonymisation and encryption (article 32), notify data breaches to the supervisory authority and data subjects (articles 33-34), designate and ensure the correct operation of a data protection officer when applicable (articles 37-39), and comply with the provisions on transfers of personal data to third countries or international organizations (articles 44-46).

[57] Individuals have the right of access (article 15), rectification (article 16), erasure or right to be forgotten (article 17), restriction of processing (article 18), the right to receive notification regarding rectification or erasure of personal data or restriction of processing (article 19), the right to data portability (article 20), the right to object (article 21), and the right not to be subject to automated individual decision-making (article 22). It should also be noted the right to refuse or withdraw consent (article 7), to receive communication of data breaches (article 34), the right to lodge a complaint with a supervisory authority (article 77), the right to an effective judicial remedy against a supervisory authority or against a controller or processor (articles 78-79), and the right to receive compensation for the damage suffered (article 82).