We spoke with attorney Frank Stiegler about the legal background to PRYVY Analytics and Google Analytics in relation to the General Data Protection Regulation (GDPR).
Frank Stiegler has been working as a lawyer since 2006 and focuses on IT and data protection law. His podcast “Legal Bits” covers topics related to IT law and related areas of law. Privately, he has been active in the demoscene since 1997 as a musician, host and organizer in the computer art scene.
PRYVY: Hi Frank, thank you for taking the time to talk to us. We’ll jump right in: in your opinion, is the use of Google Analytics in the EU compatible with the GDPR?
Frank Stiegler: I think it’s possible, but given the broad use for which Google Analytics is intended, it’s often done wrong. The service is quite easy to install and use, which is great on the one hand but problematic on the other: it’s not just a matter of “turning it on”; there are a number of hurdles that you not only have to be aware of, but also have to overcome. You could also say that the service itself and its use are two different things.
The hurdles start with invalid consent to the storage of cookies, include missing, incorrect or misplaced privacy notices, and go all the way to the question of how to deal with the fact that data is transferred to the US and that the GDPR has set specific stipulations for said transfer. And even if you really want to get it right, you have the problem that in Germany, different supervisory authorities evaluate legal issues differently and some are quite “tough” while you hardly hear anything from others. To make a long story short: It’s complicated.
PRYVY: Google has responded and announced plans to switch to FLoC. FLoC is the abbreviation for “Federated Learning of Cohorts”, i.e. the distributed learning (or forming) of cohorts. This means that users will no longer be recognized personally, but will be divided into groups. Does this alleviate the pain in terms of data protection? What is your assessment of this?
Frank Stiegler: This is a complex and complicated topic that could easily be the subject of a 90-minute podcast. This much for now: It helps in several places, but possibly not in the way that Google would like it to, or that you might think.
To understand my answer, you first need to know: the “cookie dialogs” that have been plaguing us for a while now are not actually required by the GDPR, but (still) by the E-Privacy Directive. What almost no one knows: e-privacy law and data protection law are two entirely different areas with completely different perspectives and requirements. But both areas need to be considered, and unfortunately, practically no one can handle them confidently without expert help.
As far as I know, Google invented FLoC to replace two things: cookies from third-party providers (third-party cookies, here both an e-privacy and a GDPR issue) and the reading of the browser fingerprint (a GDPR issue). Cookies are stored on the users’ end devices; the fingerprints are read and stored by the (Google) web server. With FLoC, the browser of the respective end device is supposed to replace both, so to speak, by tracking surfing behavior and then assigning the users to certain groups (“cohorts”). I have not yet been able to figure out whether a specific end device is being tracked or, as I have understood it from various articles, the actual users. This can make a significant difference: if I am constantly looking for tools on my cell phone because I need them regularly at work, but at home on my private laptop I only watch tearjerker movies, then in the first case I will get ads matched to each device, while in the second I will get ads everywhere that match “me” rather than the device.
If the users themselves are tracked, and not just the devices, FLoC replaces one set of legal hurdles with another, so to speak. FLoC may ensure that the advertising industry faces fewer hurdles, but the data protection problem remains. For me as a user, this means it would be good to no longer have cookie dialogs in front of my nose everywhere, but the tracking might still happen, and in the worst case I no longer even have a choice, so I lose control over “my” data.
In this respect, no, it does not make “everything better in the area of data protection”, but rather more obscure for users, depending on the type of implementation.
PRYVY: There are various other providers of website analytics besides Google. As a site operator, am I allowed to use tools at all that, unlike PRYVY Analytics, track the personal data of my visitors, such as age, place of residence or areas of interest, without their consent?
Frank Stiegler: I think personal tracking can be legally compliant without consent in certain forms, but there will always be a residue of uncertainty as to whether “that’s okay.”
It is important to be precise in terminology: the GDPR talks about “personal” data, not “private” data. There is a category of “particularly sensitive” data under the GDPR, but it tends to include things like sexual orientation, religious beliefs and health data, not age, place of residence or general interests. Now to your question: “consent” according to the GDPR is not always required, and where several legal bases could support tracking, it is often the worst choice. Why? Because “legally secure” consent is difficult to obtain and can be revoked at any time. Consent is therefore the hardest to get and the easiest to lose, so to speak.
Instead, under the GDPR (Art. 6 para. 1 sentence 1 lit. f) it is also possible to base tracking on so-called legitimate interests. Roughly speaking, this means that if I as a company want to look at who is visiting my website, and how, out of my interest in sales promotion and direct advertising, I am allowed to do so unless the interests of the data subjects outweigh mine. Incidentally, this also means that if both weigh equally, it is allowed. The catch is that this balancing can rarely be done in a “legally secure” way; if the supervisory authority imposes a fine on me, or if data subjects sue me and a court subsequently sees things differently, it is of little use that I previously considered it permissible.
PRYVY: Can you explain what the difference is between “personal”, “pseudonymous” and “anonymous” collection of user data?
Frank Stiegler: The distinction between “personal” and “anonymous” is something I think everyone can already imagine quite well: In one case, I am identifiable as a person – in whatever way – in the other, I am an unidentifiable part of a crowd, e.g. a number of users who visited a certain website on a certain day. A “pseudonym” also counts, roughly speaking, as personal data, but with a kind of cipher that allows only certain people to identify me. So the pseudonym allows this identification only to some, not to all. A pseudonym can be, for example, the matriculation number at college that allows the administration to identify the person but not the other students or others. When using cookies, e.g. for the purposes of online tracking, the assigned ID can be a pseudonym. Anonymous web tracking, by the way, is more complicated than you might think, at least if you want to know how many different people have visited the site. If you count every page view without differentiating between users, 100,000 views are technically correct, but may miss the important information that they came from only 50 users (and thus potentially bots).
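The counting distinction Frank describes – total page views versus distinct visitors – can be shown with a minimal sketch. The visitor IDs here are invented for illustration and stand in for whatever pseudonymous or anonymous identifier a tracker assigns:

```python
# Illustrative sketch: raw page views vs. distinct visitors.
# The visitor IDs below are made up for the example.
page_views = ["id-a", "id-a", "id-b", "id-a", "id-c", "id-b"]

total_views = len(page_views)           # every hit counts: 6
unique_visitors = len(set(page_views))  # distinct IDs only: 3

print(f"{total_views} views from {unique_visitors} visitors")
```

Without some per-visitor identifier, only the first number is available – which is exactly why “anonymous but still unique-visit-aware” tracking is harder than it sounds.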
PRYVY: At PRYVY Analytics we put a lot of emphasis on data anonymization. We have explained our technical procedure for anonymizing visitor data to you in advance. Based on this, what is your assessment of PRYVY Analytics with regard to the GDPR?
Frank Stiegler: To be clear: I have already heard from various providers that they track anonymously. Often this is simply wrong, because in fact they track pseudonymously, not anonymously. However, since pseudonymous data is also personal data under the GDPR, while anonymous data is not, the distinction is important. As far as I know, the approach of PRYVY Analytics avoids as far as possible the identifiability of natural persons without completely losing the information about user-specific visits, i.e. “unique visits”. The method uses a hash process to generate IDs from various parameters for people who access a website running PRYVY Analytics; without any further data, these IDs do not allow website operators to draw any conclusions about the identity of the visitors. No personal data is stored; even the IP address used for the procedure is truncated before it is used to generate the hash value. This means that tracking can be user-specific and yet anonymous. Neither website operators nor the operator of PRYVY Analytics know which people are behind the IDs, but they can still see how many different people were on the website.
I wrote “without any further data” above because it is clear that when website operators use PRYVY Analytics, the data itself is anonymous. But if, for example, they collect further data at some point that makes identification possible, then of course the anonymity is gone. Naturally, this has nothing to do with PRYVY Analytics; the same phenomenon occurs with any service, no matter how anonymous it may be.
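The truncate-then-hash idea described above might look roughly like this. This is only an illustrative sketch under stated assumptions – the function name, the choice of SHA-256, the salt, and the exact input parameters are hypothetical, not PRYVY’s actual implementation:

```python
import hashlib

def anonymous_visitor_id(ip: str, user_agent: str, site_salt: str) -> str:
    """Hypothetical sketch of a truncate-then-hash visitor ID.

    The IP address is truncated (last octet dropped) *before* hashing,
    so the full address never enters the hash input, let alone storage.
    """
    truncated_ip = ".".join(ip.split(".")[:3]) + ".0"  # 203.0.113.42 -> 203.0.113.0
    material = f"{truncated_ip}|{user_agent}|{site_salt}"
    return hashlib.sha256(material.encode("utf-8")).hexdigest()[:16]

# Same truncated network + same browser -> same ID, so unique visits
# can still be counted without keeping any identifying data.
a = anonymous_visitor_id("203.0.113.42", "Mozilla/5.0", "example-salt")
b = anonymous_visitor_id("203.0.113.99", "Mozilla/5.0", "example-salt")
print(a == b)  # True: both addresses truncate to 203.0.113.0
```

Note the trade-off this sketch makes visible: truncation deliberately collapses nearby addresses into one ID, which is what trades a little counting precision for non-identifiability.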
PRYVY: That means, with PRYVY Analytics it is essentially not even necessary to list the use in the privacy notices?
PRYVY: What advice would you now give to site operators in the EU who want to collect data about the use of their website?
Frank Stiegler: In this regard, I always say: personal data that I do not have does not trigger any GDPR obligations for me. Every piece of personal data that I process ensures that I have to do something. This can be having to give notice, a contractual provision, obtaining consent, taking certain technical and/or organizational measures, and/or documentation. My advice is: If you can get the necessary information with anonymous data, you should only use anonymous data for tracking.
PRYVY: Finally, a look into the future: how do you assess the development of the next five to ten years in the field of e-privacy on the Internet? In your opinion, will users’ rights be increasingly strengthened, or will the big advertising companies find strategies to collect more information about Internet users?
Frank Stiegler: To be honest, I’m not particularly optimistic about the position of users. This question alone could take days of discussion. I’ll keep it as short as possible. We have a legal situation that points to outdated technologies and is opaque, especially for legal laymen. And even now, neither technology nor business models are understood by the average user. Let’s be honest: Even today, many people still respond to data protection efforts with the uncharitable phrase that they have nothing to hide, so what’s the fuss about?
It’s hard to say where e-privacy is headed, given the chaotic legislative process around the planned E-Privacy Regulation. But one thing is clear: any legislator who demands that users declare their consent to everything and anything is, in my opinion, shirking their duties. It is the legislature’s job to create and maintain laws that are fair to consumers, without consumers constantly having to take responsibility for things they obviously don’t understand. And yet, in many places consent is still – or perhaps increasingly – seen as the panacea of data protection: consent to the storage of cookies here, to ad personalization there, to data sharing here, to profiling there.
Practically none of this is really understood by the average user in all its consequences. It “annoys” them more than anything else, and, even if this has little evidentiary value, no one has ever told me that they now feel better protected because of the cookie dialog boxes.
Therefore, I expect we will keep seeing dialog boxes in every conceivable place where “something with data protection” happens, asking for consent whose refusal is simply too annoying at that moment. In a sense, users are being burdened with annoying clicks whose meaning and consequences they neither understand nor care about, so that companies can fulfill an accountability obligation. But I think this kind of accountability is dangerous. Normally, actors have certain obligations, and if they violate them, there can be consequences. But requiring them to be able to prove at all times that they are doing everything right really turns things upside down. Imagine having to prove that you obeyed every traffic regulation whenever you drove a car! You would quickly realize how much effort that takes, without improving anyone’s situation on the road.
In my opinion, data protection and e-privacy would have to be completely rethought and redone if one wanted to effectively change something for those affected.
PRYVY: Thank you for your assessment!
For more information on Frank and his podcast, visit his website at: www.stiegler.legal