Wednesday 7 October 2015

The party’s over: EU data protection law after the Schrems Safe Harbour judgment




Steve Peers

The relationship between intelligence and law enforcement agencies (and companies like Google and Facebook) and personal data is much like the relationship between children and sweets at a birthday party. Imagine you’re a parent bringing out a huge bowl full of sweets (the personal data) during the birthday party – and then telling the children (the agencies and companies) that they can’t have any. But how can you enforce this rule? If you leave the room, even for a moment, the sweets will be gone within seconds, no matter how fervently you insist that the children leave them alone while you’re out. If you stay in the room, you will face incessant and increasingly shrill demands for access to the sweets, based on every conceivable self-interested and guilt-trippy argument. If you try to hide the sweets, the children will overturn everything to find them again.

When children find their demands thwarted by a strict parent, they have a time-honoured circumvention strategy: “When Mummy says No, ask Daddy”. But in the Safe Harbour case, things have happened the other way around. Mummy (the Commission) barely even resisted the children’s demands. In fact, she said Yes hours ago, and retired to the bath with an enormous glass of wine, occasionally shouting out feeble admonitions for the children to tone down their sugar-fuelled rampage. Now Daddy (the CJEU) is home, shocked at the chaos that results from lax parenting. He has immediately stopped the supply of further sweets. But the house is full of other sugary treats, and all the children are now crying. What now?

In this post, I’ll examine the reasons why the Court put its foot down, and invalidated the Commission’s ‘Safe Harbour’ decision which allowed transfers of personal data to the USA, in the recent judgment in Schrems. Then I will examine the consequences of the Court’s ruling. But I should probably admit for the record that my parenting is more like Mummy's than Daddy's in the above example.

Background

For more on the background to the Schrems case, see here; on the hearing, see Simon McGarr’s summary here; and on the Advocate-General’s opinion, see here. But I’ll summarise the basics of the case again briefly.

Max Schrems is an Austrian Facebook user who was disturbed by Edward Snowden’s revelations about mass surveillance by US intelligence agencies. Since he believed that transfers of his data to Facebook were subject to such mass surveillance, he complained to the Irish data protection authority, which regulates Facebook’s transfers of personal data from the EU to the USA.

The substantive law governing these transfers of personal data was the ‘Safe Harbour’ agreement between the EU and the USA, agreed back in 2000. This agreement was put into effect in the EU by a decision of the Commission, which was adopted pursuant to powers conferred upon the Commission by the EU’s current data protection Directive. The latter law gives the Commission the power to decide that transfers of personal data outside the EU receive an ‘adequate level of protection’ in particular countries.

The ‘Safe Harbour’ agreement was enforced by self-certification of the companies that had signed up to it (note that not all transfers to the USA fell within the scope of the Safe Harbour decision, since not all American companies signed up). Those promises were in turn meant to be enforced by the US authorities. But it was also possible (not mandatory) for the national data protection authorities which enforce EU data protection law to suspend transfers of personal data under the agreement, if the US authorities or enforcement system found a breach of the rules, or on other limited grounds set out in the decision.

The Irish data protection authority refused to consider Schrems’ complaint, so he challenged that decision before the Irish High Court, which doubted that this system was compatible with EU law (or indeed the Irish constitution). So that court asked the CJEU to rule on whether national data protection authorities (DPAs) should have the power to prevent data transfers in cases like these.

The judgment

The CJEU first of all answers the question which the Irish court asks about DPA jurisdiction over data transfers (the procedural point), and then goes on to rule that the Safe Harbour decision is invalid (the substantive point).

Following the Advocate-General’s view, the Court ruled that national data protection authorities have to be able to consider claims that flows of personal data to third countries are not compatible with EU data protection laws if there is an inadequate level of data protection in those countries, even if the Commission has adopted a decision (such as the Safe Harbour decision) declaring that the level of protection is adequate. Like the Advocate-General, the Court based this conclusion on the powers and independence of those authorities, read in light of the EU Charter of Fundamental Rights, which expressly refers to DPAs’ role and independence. (On the recent CJEU case law on DPA independence, see discussion here). In fact, the new EU data protection law currently under negotiation (the data protection Regulation) will likely confirm and even enhance the powers and independence of DPAs. (More on that aspect of the proposed Regulation here).

The Court then elaborates upon the ‘architecture’ of the EU’s data protection system as regards external transfers. It points out that either the Commission or Member States can decide that a third country has an ‘adequate’ level of data protection, although it focusses its analysis upon what happens if (as in this case) there is a Commission decision to this effect. In that case, national authorities (including DPAs) are bound by the Commission decision, and cannot issue a contrary ruling.

However, individuals like Max Schrems can still complain to the DPAs about alleged breaches of their data protection rights, despite the adoption of the Commission decision. If they do so, the Court implies, the validity of the Commission’s decision is thereby called into question. While all EU acts must be subject to judicial review, the Court reiterates the usual rule that national courts can’t declare EU acts invalid, since that would fragment EU law: only the CJEU can do that. This restriction applies equally to national DPAs.

So how can a Commission decision on the adequacy of third countries’ data protection law be effectively challenged? The Court explains that DPAs must consider such claims seriously. If the DPA thinks that the claim is unfounded, the disgruntled complainant can challenge the DPA’s decision before the national courts, which must in turn refer the issue of the validity of the decision to the CJEU if they think it may be well founded. If, on the other hand, the DPA thinks the complaint is well-founded, there must be rules in national law allowing the DPA to go before the national courts in order to get the issue referred to the CJEU.

The Court then moves on to the substantive validity of the Safe Harbour decision. Although the national court didn’t ask it to examine this issue, the Court justifies its decision to do this by reference to its overall analysis of the architecture of EU data protection law, as well as the national court’s doubts about the Safe Harbour decision. Indeed, the Court is effectively putting its new architecture into use for the first time, and it’s quite an understatement to say that the national court had doubts about Safe Harbour (it had compared surveillance in the USA to that of Communist-era East Germany).

So what is an ‘adequate level of protection’ for personal data in third countries? The Court admits that the Directive is not clear on this point, so it has to interpret the rules. In the Court’s view, there must be a ‘high’ level of protection in the third country; this does not have to be ‘identical’ to the EU standard, but must be ‘essentially equivalent’ to it. Otherwise, the objective of ensuring a high level of protection would not be met, and the EU’s internal standards for domestic data protection could easily be circumvented. Also, the means used in the third State to ensure data protection rights must be ‘effective…in practice’, although they ‘may differ’ from those in the EU. Furthermore, the assessment of adequacy must be dynamic, with regular automatic reviews and an obligation for a further review if evidence suggests that there are ‘doubts’ on this score; and the general changes in circumstances since the decision was adopted must be taken into account.

The Court then establishes that in light of the importance of privacy and data protection, and the large number of persons whose rights will be affected if data is transferred to a third country with an inadequate level of data protection, the Commission has reduced discretion, and is subject to ‘strict’ standards of judicial review. Applying this test, the Court found two provisions of the ‘Safe Harbour’ decision to be invalid.

First of all, the basic decision declaring adequate data protection in the USA (in the context of Safe Harbour) was invalid. While such a decision could, in principle, be based on self-certification, this had to be accompanied by ‘effective detection and supervision mechanisms’ ensuring that infringements of fundamental rights were ‘identified and punished in practice’. Self-certification under the Safe Harbour rules did not apply to US public authorities; there was not a sufficient finding that the US law or commitments met EU standards; and the rules could be overridden by national security requirements set out in US law.

Data protection rules apply regardless of whether the information is sensitive, or whether there were adverse consequences for the persons concerned. The Decision had no finding concerning human rights protections as regards the national security exceptions under US law (although the CJEU acknowledged that such rules pursued a legitimate objective), or effective legal protection in that context. This was confirmed by the Commission’s review of the Safe Harbour decision, which found (a) that US authorities could access personal data transferred from the EU, and then process it for purposes incompatible with the original transfer ‘beyond what was strictly necessary and proportionate for the purposes of national security’, and (b) that there was no administrative or judicial means to ensure access to the data and its rectification or erasure.

Within the EU, interference with privacy and data protection rights requires ‘clear and precise rules’ which set out minimum safeguards, as well as strict application of derogations and limitations.  Those principles were breached where, ‘on a generalised basis’, legislation authorises ‘storage of all the personal data of all the persons whose data has been transferred’ to the US ‘without any differentiation, limitation or exception being made in light of the objective pursued’ and without any objective test limiting access of the public authorities for specific purposes. General access to the content of communications compromises the ‘essence’ of the right to privacy. On these points, the Court expressly reiterated the limits on mass surveillance set out in last year’s Digital Rights judgment (discussed here) on the validity of the EU’s data retention Directive. Furthermore, the absence of legal remedies in this regard compromises the essence of the right to judicial protection set out in the EU Charter. But the Commission made no findings to this effect.

Secondly, the restriction upon DPAs taking action to prevent data transfers in the event of an inadequate level of data protection in the USA (in the context of Safe Harbour) was also invalid. The Commission did not have the power under the data protection Directive (read in light of the Charter) to restrict DPA competence in that way. Since these two provisions were inseparable from the rest of the Safe Harbour decision, the entire Decision is invalid. The Court did not limit the effect of its ruling.

Comments

The Court’s judgment comes to the same conclusion as the Advocate-General’s opinion, but with subtle differences that I’ll examine as we go along. On the first issue, the Court’s finding that DPAs must be able to stop data flows if there is a breach of EU data protection laws in a third country, despite an adequacy Decision by the Commission, is clearly the correct result. Otherwise it would be too easy for the standards in the Directive to be undercut by means of transfers to third countries, which the Commission or national authorities might be willing to accept as a trade-off for a trade agreement or some other quid pro quo with the country concerned.

As for the Court’s discussion of the architecture of the data protection rules, the idea of the data protection authorities having to go to a national court if they agree with the complainant that the Commission’s adequacy decision is legally suspect is rather convoluted, since it’s not clear who the parties would be: it’s awkward that the Commission itself would probably not be a party.  It’s unfortunate that the Court did not consider the alternative route of the national DPA calling on the Commission to amend its decision, and bringing a ‘failure to act’ proceeding directly in the EU courts if it did not do so. In the medium term, it would be better for the future so-called ‘one-stop shop’ system under the new data protection Regulation (see discussion here) to address this issue, and provide for a centralised process of challenging the Commission directly.

It’s interesting that the CJEU finds that there can be a national decision on adequacy of data flows to third States, since there’s no express reference to this possibility in the Directive. If such a decision is adopted, or if Member States apply the various mandatory and optional exceptions from the general external data protection rules set out in Article 26 of the data protection Directive, much of the Court’s Schrems ruling would apply in the same way by analogy. In particular, national DPAs must surely have the jurisdiction to examine complaints about the validity of such decisions too. But EU law does not prohibit the DPAs from finding the national decisions invalid; the interesting question is whether it obliges national law to confer such power upon the DPAs. Arguably it does, to ensure the effectiveness of the EU rules. Any decisions on these issues could still be appealed to the national courts, which would have the option (though not the obligation, except for final courts) to ask the CJEU to interpret the EU rules.

As for the validity of the Safe Harbour Decision, the Court’s interpretation of the meaning of ‘adequate’ protection in third States should probably be sung out loud, to the tune of ‘We are the World’. The global reach of the EU’s general data protection rules was already strengthened by last year’s Google Spain judgment (discussed here); now the Court declares that even the separate regime for external transfers is very similar to the domestic regime anyway. There must be an almost identical degree of protection, although the Court does hint that modest differences are permissible: it accepts the idea of self-certification, and avoids the issue of whether third States need an independent DPA (the Advocate-General had argued that they did).

It’s a long way from the judgment in Lindqvist over a decade ago, when the Court anxiously insisted that the external regime should not be turned into a copy of the internal rules; now it’s insistent that there should be as little a gap as possible between them. With respect, the Court’s interpretation is not convincing, since the word ‘adequate’ suggests something less than ‘essentially equivalent’, and the EU Charter does not bind third States.

But having said that, the American rules on mass surveillance would violate even a far more generous interpretation of the meaning of the word ‘adequate’. It’s striking that (unlike the Advocate-General) the Court does not engage in a detailed interpretation of the grounds for limiting Charter rights, but rather states that general mass surveillance of the content of communications affects the ‘essence’ of the right to privacy. That is enough to find an unjustifiable violation of the Charter.

So where does the judgment leave us in practice? Since the Court refers frequently to the primary law rules in the Charter, there’s no real chance to escape what it says by signing new treaties (even the planned TTIP or TiSA), by adopting new decisions, or by amending the data protection Directive. In particular, the Safe Harbour decision is invalid, and the Commission could only replace it with a decision that meets the standards set out in this judgment. While the Court refers at some points to the inadequacy or non-existence of the Commission’s findings in the Decision, it’s hard to believe that a new Decision claiming that the American system now meets the Court’s standards would be valid if the Commission were not telling the truth (or if circumstances subsequently changed).

What standards does the US have to meet? The Court reiterates even more clearly that mass surveillance is inherently a problem, regardless of the safeguards in place to limit its abuse. Indeed, as noted already, the Court ruled that mass surveillance of the content of communications breaches the essence of the right to privacy and so cannot be justified at all. (Surveillance of content which is targeted on suspected criminal activities or security threats is clearly justifiable, however). In addition to a ban on mass surveillance, there must also be detailed safeguards in place. The US might soon be reluctantly willing to address the latter, but it will be even more unwilling to address the former.

Are there other routes which could guarantee that external transfers to the USA take place, at least until the US law is changed? In principle, yes, since (as noted above) there are derogations from the general rule that transfers can only take place to countries with an ‘adequate’ level of data protection. A first set of derogations is mandatory (though Member States can have exceptions in ‘domestic law governing particular cases’): where the data subject gives ‘consent unambiguously’; where the transfer is necessary to perform a contract with (or in the interest of) the data subject, or for pre-contractual relations; where it’s ‘necessary or legally required on important public interest grounds’, or related to legal claims; where it’s ‘necessary to protect the vital interests of the data subject’; or where it’s made from a public register. A second derogation is optional: a Member State may authorise transfers where the controller offers sufficient safeguards, possibly in the form of contractual clauses. The use of the latter derogation can be controlled by the Commission.

It’s hard to see how the second derogation can be relevant, in light of the Court’s concerns about the sufficiency of safeguards under the current law. As for the first set of derogations, US access to the data is not necessary in relation to a contract, to protect the data subject, or related to legal claims. An imaginative lawyer might argue that a search engine (though not a social network) is a modern form of public register; but the record of an individual’s use of a search engine is not.

This leaves us with consent and public interest grounds. Undoubtedly (as the CJEU accepted) national security interests are legitimate, but in the context of defining adequacy, they do not justify mass surveillance or insufficient safeguards. Would the Court’s ruling in Schrems still apply fully where a transfer relies on this derogation, despite the inadequate level of protection in the third country? Or would it apply in a modified way, or not at all?

As for consent, the CJEU ruled last year in a very different context (credibility assessment in LGBT asylum claims) that the rights to privacy and dignity could not be waived in certain situations (see discussion here). Is that also true to some extent in the context of data protection? And what does unambiguous consent mean exactly? Most people believe they are consenting only to (selected) people seeing what they post on Facebook, and are dimly aware that Facebook might do something with their data to earn money. They may be more aware of mass surveillance since the Snowden revelations; some don’t care, but some (like Max Schrems) would like to use Facebook without such surveillance. Would people have to consent separately to mass surveillance? In that case, would Facebook have to be accessible for those who did not want to sign that separate form? Or could a ‘spy on me’ clause be added at the end of a long (and unread) consent form? Consent is also a crucial issue in the context of the purely domestic EU data protection rules.

The Court’s ruling has addressed some important points, but leaves an enormous number of issues open. It’s clear that it will take a long time to clear up the mess left by this particular poorly supervised party.


Barnard and Peers: chapter 9

Photo credit: www.businessinsider.com

19 comments:

  1. Thanks for the useful analysis. I suspect we are looking down the barrel of a Spookie notice on every website that permits data surveillance, like a cookie notice, requiring user waiver of privacy rights as a condition of active use (e.g. to post a comment).

    1. Thanks, Paul. A 'Spookie' notice could perhaps work, but only if the full extent of the programme is clear - and the US is saying that the Advocate-General's conclusions were unfounded: http://www.euractiv.com/sections/infosociety/us-slams-ecj-advisors-safe-harbour-opinion-318042

  2. In terms of consent, one could argue that signing up to Facebook means one is prepared to take certain risks.

    Just for those who think this ruling is a bad thing for Facebook, Ben Wright makes clear it may actually be a good thing, as the "big boys", like Facebook, can afford the extra requirements:
    "Certainly consumers may end up paying a price for the ECJ ruling. The largest digital companies will be able to afford to set up data centres and, potentially, separate operations in Europe. However, without “safe harbour”, it will take longer for smaller companies and start-ups to spread across the pond. It could also hit companies doing things as mundane as transferring payroll data or information for marketing campaigns. The political ramifications could spread even wider; the already fraught negotiations on the US-EU trade pact will now be fraughter still."

    http://www.telegraph.co.uk/finance/newsbysector/mediatechnologyandtelecoms/digital-media/11915673/Do-you-want-free-Facebook-or-a-say-in-where-your-personal-data-is-stored-its-unlikely-you-will-have-both.html

    1. Thanks for your comments, Pieter. The Directive requires 'unambiguous' consent for the derogation re transfers of data despite inadequate protection in third countries to apply. I think this means that the risks must be specified to some extent. Also I have trouble seeing how separate data centres could work for those with Facebook friends etc in the USA - there would have to be some data processing there in that case. Would the PRISM programme extend even to payroll data and all companies' marketing data? Hard to know when the US says that the allegations are inaccurate anyway.

  3. Thanks for providing an interesting and well-thought-out piece, Steve. Two quick observations: (i) Taking Pieter's point, does the ruling, which in effect externalises the costs onto individuals, not leave us with the perennial problem of detecting breaches in the first instance? (ii) It seems odd that while we celebrate the striking down of the Commission's Decision, we seem almost powerless when it comes to Care.data and the not dissimilar marginalisation of our fundamental rights.

    1. Thanks for your comments, Joseph. Yes, there will always be the problem of detecting breaches, though it applies in many different contexts. I don't know an easy way to solve it. Secondly, people will have to keep complaining and litigating about care data and other alleged breaches of data protection law. It's now clear enough from CJEU case law that the law is much stricter than was previously thought, and the ICO and courts in the UK (and elsewhere) ought to be taking this into account.

  4. Facebook does not allow me to post this page.

    "You can't post this because it has a blocked link

    The content you're trying to share includes a link that our security systems detected to be unsafe:

    http://eulawanalysis.blogspot.com/2015/10/the-partys-over-eu-data-protection-law.html

    Please remove this link to continue.
    If you think you're seeing this by mistake, please let us know."

    1. Thanks for informing me, I have heard this from several other sources too. (Have you tried linking with another domain, i.e. .be or .fr?) FB have also twice removed the link to this blog post from the blog FB page. As far as I can see, this only relates to this particular post, not any others. You might very well think this is suspicious; I couldn't possibly comment...

  5. Amazon released a response to this ruling, which claims they are exempt:

    > we’d like to confirm for customers and partners that they can continue to use AWS to transfer their customer content from the EEA to the US, without altering workloads, and in compliance with EU law. This is possible because AWS has already obtained approval from EU data protection authorities (known as the Article 29 Working Party)

    What does that mean?

    Link: https://blogs.aws.amazon.com/security/post/Tx3QAALRNBIK9K1/Customer-Update-AWS-and-EU-Safe-Harbor

    1. Thanks for your question, Daniel. The Article 29 Working Party is a group of national data protection regulators which issues non-binding opinions. They haven't responded to the Safe Harbour ruling yet, so Amazon is jumping the gun here. More specifically, Amazon is saying it's OK because it's using model contractual clauses. In the fourth-last paragraph above I dismiss the possible use of such clauses as a solution, because they can only be used if there are sufficient safeguards, according to the Directive. It's obvious from the Court's rulings that it would regard the safeguards as insufficient, at least as long as the NSA could potentially get hold of that data. A national data protection authority has already endorsed this view, see: https://castlebridge.ie/news/2015/10/14/schleswig-holstein-model-clauses-ist-kaput-update-1

  6. As of yesterday, Google have been sending out something similar, also based on the idea that 'everything is fine so long as you use the model clauses':

    "European Safe Harbor ruling update and Google Apps

    Hello Apps Administrator,

    Please note that the update below is relevant only if you process personal data and European Data Protection laws apply to that processing. This will often be the case if your business is based in the European Union. If you are unsure whether this applies to you, we suggest you seek advice from legal counsel.

    On October 6, 2015, Europe’s highest court declared that the decision of the European Commission regarding the US-EU Safe Harbor framework―one of the legal mechanisms that enables the transfer of personal data from the EU to US companies―is invalid, on the basis that Safe Harbor doesn’t provide an adequate level of protection for personal data originating in the EU.

    Through 2015, the European Commission and the US have been negotiating a revised Safe Harbor agreement that should address these concerns, but they were not able to finalize the agreement before the court issued its ruling. Both the Commission and the US have committed to finalizing the revised agreement as soon as possible.

    In the meantime, we’d like to reassure you that we offer a compliance alternative to the Safe Harbor framework and have done so since 2012. Specifically, we offer a data processing amendment and model contract clauses as an additional means―beyond the Safe Harbor framework―of meeting the adequacy and security requirements of the EU Data Protection Directive. Model contract clauses were created specifically by the European Commission to permit the transfer of personal data from Europe.

    Many Google Apps customers have already adopted the data processing amendment and model contract clauses. If you have not already done so, we’d like to remind our Google Apps customers to consider opting-in to the data-processing amendment and model contract clauses. Instructions are available in the Help Center.

    We are committed to helping our customers address their regulatory compliance needs in this area, and we thank you for entrusting your data to Google.

    If you have additional questions, please contact your Google representative or Google Apps Support.

    Sincerely,
    The Google Apps Team"

    1. Thanks for this. This is just as flawed as the Amazon position, for exactly the same reason.

    2. Thanks. I notice that the Article 29 Working Group has taken a position that is directly counter to yours, in that it explicitly says that transfers using the Model Clauses remain valid. See http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/2015/20151016_wp29_statement_on_schrems_judgement.pdf

    3. Indeed. I notice that they give no legal reasoning for their opinion. Technically the CJEU has not invalidated model clauses in the Schrems judgment because they were not the subject-matter of the judgment, but the point is that it is hard to distinguish them from Safe Harbour as regards the question of adequate guarantees. So while companies might wish to rely on them because they are currently valid, I think there is a reasonable chance that they would be ruled invalid if the case arose.

    4. Wouldn't transfers of personal data under the model clauses still be legal if the personal data was (effectively) encrypted both "in motion" and "at rest"? That would appear to negate the mass surveillance?

    5. @Anonymous: if the encryption key is never transmitted to the countries where the data is, then there is no practical benefit from transmitting the data aside from backup. (Homomorphic encryption isn't as powerful as we might wish.) Why would a big company want to store data about EU nationals on US servers if those servers could never decrypt the data and were simple storage systems?

      On the other hand, if the encryption key is ever transmitted to where the data is, we're back to the original problems since local authorities could access the key (and then the data) in ways that contradict EU privacy rules.
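
      To picture that trade-off concretely, here is a minimal illustrative sketch (not something from this thread or the judgment) of the 'key never leaves the EU' approach, assuming Python and the third-party cryptography library's Fernet recipe; the US-hosted 'store' is just a stand-in dictionary:

      from cryptography.fernet import Fernet

      # The key is generated and retained on EU-side systems only.
      eu_held_key = Fernet.generate_key()
      cipher = Fernet(eu_held_key)

      def export_record(record: bytes) -> bytes:
          # Encrypt before the data leaves the EU: it stays ciphertext in motion and at rest.
          return cipher.encrypt(record)

      def import_record(ciphertext: bytes) -> bytes:
          # Decryption is only possible where the key is held, i.e. back in the EU.
          return cipher.decrypt(ciphertext)

      # Hypothetical US-hosted storage: it only ever receives ciphertext.
      us_store = {}
      us_store["user42/profile"] = export_record(b"name=Max; country=AT")

      # Anyone with access to us_store alone cannot read the data; but, as noted above,
      # the US side can then do nothing useful with it except store and return it.
      assert import_record(us_store["user42/profile"]) == b"name=Max; country=AT"

      As the comment points out, this only holds as long as neither the key nor the decrypted data ever crosses over, which reduces the US servers to a backup or simple storage service.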
