The Council of the European Union has finally delivered its version of the ePrivacy Regulation. The work can now move forward.

I have tracked all relevant ePrivacy developments since 2016, and I also participated directly in the work as an expert and advisor. While this is likely my last analytical entry in this drama before the final version is settled, it is informative to compare the version of the European Commission (EC), the changes introduced by the European Parliament (EP), and the Council text. The EP version is the best for privacy, while the Council one is the weakest, and even self-inconsistent in places, which is worrying. The biggest challenge for ePrivacy is that it fails to account for recent technology changes, particularly in how web browsers and advertising technologies evolve.

Let’s speak of the details.

  • Does the Council ePrivacy version reduce the existing privacy protections of the current ePrivacy Directive and the GDPR? To some degree, perhaps yes, and this despite the Council version claiming that “it does not lower the protection warranted by the GDPR”. This is curious, because you cannot simply hand-wave away such objective matters.
  • Does it lack vision to the degree that the “new ePrivacy” will be obsolete on the day it comes into force (or even today)? Sadly, also yes. For example, it still does not support automatic consent mechanisms.

Fines

This is the first ePrivacy regulation that will allow fines for infringements of the protection of user data in electronic communications. The fines will be substantial:

  • 10 000 000 EUR or up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher
  • and in some other cases: 20 000 000 EUR, or in the case of an undertaking, up to 4 % of the total worldwide annual turnover.
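
As a minimal sketch of the “whichever is higher” logic (assuming, as in the GDPR, that the rule applies to both tiers; the exact conditions depend on the final legal text):

    def max_fine_eur(annual_turnover_eur: float, severe: bool = False) -> float:
        """Sketch of the two fine tiers: a fixed cap or a share of the total
        worldwide annual turnover, whichever is higher."""
        fixed_cap = 20_000_000 if severe else 10_000_000
        turnover_share = (0.04 if severe else 0.02) * annual_turnover_eur
        return max(fixed_cap, turnover_share)

    # Example: an undertaking with 2 billion EUR turnover, in the higher tier.
    print(max_fine_eur(2_000_000_000, severe=True))  # 80000000.0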


This will make ePrivacy a matter of serious business. The recommendation is to comply, even in advance.

Below I discuss both ePrivacy articles and recitals, quoting extensively.


The primary motivation for ePrivacy is here:

“Article 7 of the Charter of Fundamental Rights of the European Union ("the Charter") protects the fundamental right of everyone to the respect for private and family life, home and communications. Respect for the confidentiality of one’s communications is an essential dimension of this right, applying both to natural and legal persons. Confidentiality of electronic communications ensures that information exchanged between parties and the external elements of such communication, including when the information has been sent, from where, to whom, is not to be revealed to anyone other than to the parties involved in a communication.”


It is correctly acknowledged that:

“The content of electronic communications may reveal highly sensitive information about the natural persons involved in the communication, from personal experiences and emotions to medical conditions, sexual preferences and political views, the disclosure of which could result in personal and social harm, economic loss or embarrassment. Similarly, metadata derived from electronic communications may also reveal very sensitive and personal information.”



Extraterritoriality of protections warranted

“This Regulation should apply regardless of whether the processing of electronic communications data or personal data of end-users who are in the Union takes place in the Union or not, or of whether the service provider or person processing such data is established or located in the Union or not.”

Applying to data of users in the European Union, of course.


Cafe wifis need to protect user data

This applies when providing wifi access, even if the network is open and has no password.

“To the extent that those communications networks are provided to an undefined group of end-users, regardless if these networks are secured with passwords or not, the confidentiality of the communications transmitted through such networks should be protected.”

Of course, this does not change the technical reality of data confidentiality: anyone on such a network could sniff the traffic (these days, ubiquitous HTTPS largely solves this issue, minus the metadata).


Functional restriction of end-to-end encryption?

ePrivacy is supposed to be about confidentiality and privacy. Yet it also appears that the Council version intends to expressly mandate the ability to intercept communication content:

“Regulation should not affect the ability of Member States to carry out lawful interception of electronic communications, including by requiring providers to enable and assist competent authorities in carrying out lawful interceptions, or take other measures, such as legislative measures providing for the retention of data for a limited period of time … Providers of electronic communications services should provide for appropriate procedures to facilitate legitimate requests of competent authorities.”


But reasoning logically, a provider of communication technology should be unable to grant such access to content if end-to-end encryption is part of the system design and is built correctly. This amounts to lowering the protections of the current technical status quo known, for example, from Apple’s iMessage, WhatsApp, or Signal.
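
A minimal sketch of why, with a correctly built end-to-end design, the provider has nothing meaningful to hand over: it only ever relays ciphertext. This uses the PyNaCl library for illustration; the message and key handling are simplified.

    # pip install pynacl
    from nacl.public import PrivateKey, Box

    # Each end-user generates a key pair on their own device; private keys never leave it.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts for Bob using her private key and Bob's public key.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"see you at 6pm")

    # The provider merely relays the ciphertext. Without a private key it cannot decrypt,
    # so there is no plaintext content it could disclose on request.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    assert plaintext == b"see you at 6pm"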

That said, another part states that access to content is prohibited (up to the moment it reaches the destination point):

“The prohibition of interception of electronic communications content under this Regulation should apply until receipt of the content of the electronic communication by the intended addressee, i.e. during the end-to-end exchange of electronic communications content between end-users.”

In general, this is quite unfortunate, but it follows other recent attempts to regulate or restrict end-to-end encryption in another regulation originally meant to increase cybersecurity. Where this ends up is anyone’s guess.

Another exception relaxes the consent requirement when a service provides accessibility features for persons with disabilities:

“Services that facilitate end-users everyday life such as index functionality, personal assistant, translation services and services that enable more inclusion for persons with disabilities such as text-to-speech services are emerging. Processing of electronic communication content might be necessary also for some functionalities used normally in services for individual use … consent should only be required from the end-user requesting the service taking into account that the processing should not adversely affect the fundamental rights and interest of another end-user concerned.”

I think it’s justified and appropriate. The Council made a good choice here.


Further processing of metadata

When the user consents to the processing of metadata, can the metadata be used for other purposes? Yes:

“Further processing for purposes other than for which the metadata where initially collected may take place without the consent of the end-users concerned, provided that such processing is compatible with the purpose for which the metadata are initially collected, certain additional conditions and safeguards set out by this Regulation are complied with, including the requirement to genuinely anonymise the result before sharing the analysis with third parties.”

This significantly expands the ability to process metadata, which may be very privacy-sensitive.
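
For what “genuinely anonymise the result before sharing the analysis with third parties” could look like in practice, here is a rough sketch of my own (not anything prescribed by the text): share only aggregate counts and suppress small groups.

    from collections import Counter

    def aggregate_for_sharing(records, key, min_group_size=50):
        """Turn per-user metadata records into aggregate counts and drop groups
        small enough to risk re-identification. A sketch only; real anonymisation
        needs case-by-case assessment (and possibly added noise)."""
        counts = Counter(record[key] for record in records)
        return {value: n for value, n in counts.items() if n >= min_group_size}

    # Example: share how busy each area was, never the raw connection records.
    records = [{"area": "city-centre"}] * 120 + [{"area": "suburb-7"}] * 3
    print(aggregate_for_sharing(records, key="area"))  # {'city-centre': 120}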


We also have the Covid-19 clause:

“Processing of electronic communications metadata for the protection of vital interests of the end-user may include for instance processing necessary for humanitarian purposes, including for monitoring epidemics and their spread or in humanitarian emergencies, in particular natural and man-made disasters.”

Apart from the risk of misuse, I do not see a problem with using the data for humanitarian purposes, as long as it is well justified and a data protection impact assessment (DPIA) is in place. The Council, however, does not see the point of a DPIA here.

This is weird, because a DPIA is mentioned elsewhere:

“Prior to the processing [of electronic communications content] in accordance with point (b) of paragraph 1 the provider shall carry out a data protection impact assessment of the impact of the envisaged processing operations on the protection of electronic communications data.”



Here’s the Cambridge Analytica/"Dr. Kogan" “for science” clause:

“Processing of electronic communication metadata for scientific research or statistical purposes could also be considered to be permitted processing. This type of processing should be subject to safeguards to ensure the privacy of the end-users by employing appropriate security measures such as encryption and pseudonymisation.”

I must also add that, depending on the particular use case, encryption or pseudonymisation might not offer much protection in such settings. Sensitive data remains sensitive.
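
For context, pseudonymisation of metadata identifiers is typically done with something like a keyed hash, as sketched below (the key name and identifiers are illustrative). The sketch also shows the limitation: pseudonymised records remain linkable, so behavioural patterns, and with them the sensitivity, survive.

    import hmac
    import hashlib

    SECRET_KEY = b"held-and-rotated-by-the-controller"  # illustrative key only

    def pseudonymise(identifier: str) -> str:
        """Replace an identifier (e.g. a phone number or a MAC address) with a
        keyed hash. Records stay linkable to one another, so movement or
        communication patterns are still visible in the pseudonymised data."""
        return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

    print(pseudonymise("+44 20 7946 0000"))
    print(pseudonymise("+44 20 7946 0000"))  # identical output: the records still link up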


Processing data for statistical purposes ...

“Such usage of electronic communications metadata could, for example, benefit public authorities and public transport operators to define where to develop new infrastructure, based on the usage of and pressure on the existing structure.”


It seems that the Council wants to make it expressly legal to track the movements of, for example, metro users via wifi metadata, which is confirmed later:

“Providers engaged in such practices should display prominent notices located on the edge of the area of coverage informing end-users prior to entering the defined area that the technology is in operation within a given perimeter, the purpose of the tracking, the […]”

How do they expect the prominent notices to be displayed? Along the entire perimeter of the wifi transmitter’s coverage area? That would be a lot of notices!


Cookies

“The end-user's consent to storage of a cookie or similar identifier may also entail consent for the subsequent readings of the cookie in the context of a revisit to the same website domain initially visited by the end-user.”

I see you’re interested in tracking cookies.

But it seems you may not have noticed that major web browser vendors are increasingly restricting the use of tracking cookies. This means that the new and shiny ePrivacy Regulation may be obsolete even before its adoption.
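
For reference, the “subsequent readings of the cookie in the context of a revisit” from the quoted recital is simply the ordinary cookie lifecycle, roughly as in this Flask sketch (the cookie name and the consent check are hypothetical, for illustration only):

    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/")
    def index():
        # Revisit: the browser re-sends the cookie it stored earlier.
        visitor_id = request.cookies.get("visitor_id")
        if visitor_id:
            return f"welcome back, {visitor_id}"

        resp = make_response("first visit")
        # First visit: store the cookie only once consent has been signalled
        # (here a made-up query parameter; in practice a consent-management flow).
        if request.args.get("consent") == "yes":
            resp.set_cookie("visitor_id", "abc123", max_age=3600)
        return resp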

In 2017 I said that the proposal lacked vision. Also in 2017, the European Parliament version rectified this to a significant degree. But the world did not stop there: a lot has changed since 2017. The landscape looks very different in 2020, and it will look even more different in 2022.

Some firms are preparing. It seems the Council of the EU is not among them. Too bad that this shortcoming may end up baked into the entire regulatory framework of the European Union.


Privacy, Cookies, and Competition

“Conversely, in some cases, making access to website content dependent on consent to the use of such cookies may be considered, in the presence of a clear imbalance between the end-user and the service provider as depriving the end-user of a genuine choice. This would normally be the case for websites providing certain services, such as those provided by public authorities. Similarly, such imbalance could exist where the end-user has only few or no alternatives to the service, and thus has no real choice as to the usage of cookies for instance in case of service providers in a dominant position.”

Conditioning access to a website on cookie consent (a cookie wall) would be legal, assuming the user has a choice, that is, when other providers exist and the market is competitive. But what realistic choice would remain if all the providers joined forces and conditioned website access on cookie consent?


A later part even clarifies that ad cookies are OK:

“Cookies can also be a legitimate and useful tool, for example, in assessing the effectiveness of a delivered information society service, for example of website design and advertising or by helping […]”

Specifically, for measuring the effectiveness of ad campaigns.


“Implementation of technical means in electronic communications software to provide specific and informed consent through transparent and user-friendly settings, can be useful to address this issue. Where available and technically feasible, an end user may therefore grant, through software settings, consent to a specific provider for the use of processing and storage capabilities of terminal equipment for one or multiple specific purposes across one or more specific services of that provider. For example, an end-user can give consent to the use of certain types of cookies by whitelisting one or several providers for their specified purposes. Providers of software are encouraged to include settings in their software which allows end-users, in a user friendly and transparent manner, to manage consent to the storage and access to stored data in their terminal equipment by easily setting up and amending whitelists and withdrawing consent at any moment.”

This does not describe Do Not Track/Tracking Preference Expression; it is closer to what today’s web browsers (like Chrome or Firefox) already offer. So it simply writes down the status quo. This Council position on automatic consent signals is far behind the European Parliament version, which expressly endorsed the Do Not Track signal (as applied to web consent, but also beyond it).
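
What the recital’s per-provider, per-purpose whitelist could amount to inside a browser or operating system is roughly this data structure (entirely my own sketch of the concept, not any existing browser API):

    class ConsentWhitelist:
        """Browser-side record of which provider may use storage and processing
        on the terminal equipment, and for which purposes."""

        def __init__(self):
            self._grants = {}  # provider -> set of consented purposes

        def grant(self, provider, purpose):
            self._grants.setdefault(provider, set()).add(purpose)

        def withdraw(self, provider, purpose=None):
            # Withdraw one purpose, or all consent given to the provider.
            if provider not in self._grants:
                return
            if purpose is None:
                del self._grants[provider]
            else:
                self._grants[provider].discard(purpose)

        def is_allowed(self, provider, purpose):
            return purpose in self._grants.get(provider, set())


    settings = ConsentWhitelist()
    settings.grant("news.example", "analytics")
    print(settings.is_allowed("news.example", "analytics"))    # True
    print(settings.is_allowed("news.example", "advertising"))  # False: no consent signalled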


There’s also this:

“As far as the provider is not able to identify a data subject, the technical protocol showing that consent was given from the terminal equipment shall be sufficient to demonstrate the consent of the end-user.”

It’s unclear what this even means. Could it mean interpreting a DNT-like signal (assuming the user is not identifiable)?
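
If it were read as endorsing a DNT-style signal, the server side would be as simple as honouring one request header. A short Flask sketch (the DNT: 1 header is real; whether ePrivacy would treat it as a sufficient consent signal is my speculation):

    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/article")
    def article():
        # "DNT: 1" is sent by browsers where the user enabled the Do Not Track setting.
        # A provider honouring the signal would skip non-essential storage and tracking.
        tracking_requested_off = request.headers.get("DNT") == "1"
        return {"tracking_enabled": not tracking_requested_off}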


Automatic Software Updates Are Not OK

Unless they are limited to security:

“and provided that such updates do not in any way change the functionality of the hardware or software or the privacy settings chosen by the end-user and the end-user has the possibility to postpone or turn off the automatic installation of such updates”

Sounds fair?
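
Reading the conditions as a checklist, they boil down to something like this (my own paraphrase in code, not wording from the text):

    def automatic_update_allowed(security_only: bool,
                                 changes_functionality: bool,
                                 changes_privacy_settings: bool,
                                 user_can_postpone_or_disable: bool) -> bool:
        """Sketch of the recital's conditions for installing an update without
        asking again: security-only, no change to functionality or to the
        privacy settings chosen by the end-user, and the user keeps control."""
        return (security_only
                and not changes_functionality
                and not changes_privacy_settings
                and user_can_postpone_or_disable)

    print(automatic_update_allowed(True, False, False, True))  # True
    print(automatic_update_allowed(True, True, False, True))   # False: functionality changed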


Summary

The source text of this ePrivacy version is here.

This is not over yet. The representatives of the EC, the EP, and the Council will now meet to negotiate the final version. So we still do not know the final outcome, though some common aspects are already clear. It will be tricky to find consensus, and the process will not be transparent.

While I still think that the European Parliament version offers the best privacy qualities, the biggest problem with ePrivacy today is that it does not even reflect the current status quo of a quickly changing technology landscape. The landscape has shifted and continues to change.

Logical reasoning would call for starting work on a new update immediately after the current one is adopted. The problem is that I currently do not see this happening.

Did you like the assessment and analysis? Any questions, comments, complaints, or offers for me? Feel free to reach out: me@lukaszolejnik.com