Big Fish v. Small Fish - Bargaining Position as between Controller and Processor

The European Data Protection Board is consulting on its Guidelines on the concepts of controller and processor in the GDPR until 19 October 2020. The Guidelines are long and detailed and likely to be read only by those with a serious interest in the GDPR. I thought I would highlight a particularly interesting paragraph (107) in relation to data processing agreements which may be included in a service provider's standard terms and conditions.

'The fact that the contract and its detailed terms of business are prepared by the service provider rather than by the controller is not in itself problematic and is not in itself a sufficient basis to conclude that the service provider should be considered as a controller. Also, the imbalance in the contractual power of a small data controller with respect to big service providers should not be considered as a justification for the controller to accept clauses and terms of contracts which are not in compliance with data protection law, nor can it discharge the controller from its data protection obligations. The controller must evaluate the terms and in so far as it freely accepts them and makes use of the service, it has also accepted full responsibility for compliance with the GDPR. Any proposed modification, by a processor, of data processing agreements included in standard terms and conditions should be directly notified to and approved by the controller. The mere publication of these modifications on the processor’s website is not compliant with Article 28.'

In short, a small controller cannot use its weak bargaining position as an excuse for poor GDPR compliance, but large processors cannot abuse their bargaining position by simply dictating terms.

Artificial Intelligence and identifying Controllers and Processors - 2021 may bring further clarity

The ICO has just published new guidance on Artificial Intelligence (AI). I wrote a detailed blog piece on the consultation version, so I will not write about it generally now. However, one important piece of new information is that the ICO has acknowledged that AI systems involve a number of organisations, so working out who is a Controller or a Processor for the purposes of data protection law can become complex. The ICO plans to address these issues in more detail when it revises its Cloud Computing Guidance in 2021. It will consult with stakeholders because of the questions of policy raised. Most helpfully, the new guidance will include example scenarios covering when an organisation is a Controller or a Processor in the context of AI services.

The Fall of the Privacy Shield – What next? The ICO and the EDPB have different approaches

The decision of the European Court of Justice in Case C-311/18 – Data Protection Commissioner v Facebook Ireland and Maximillian Schrems struck down the EU-US Privacy Shield with immediate effect. This caused gasps of horror in the data protection world and more widely as people realised just how significant the judgment was and just how often the Privacy Shield was the mechanism under which personal data was being transferred to the US. Initial articles on the subject sometimes suggested using Standard Contractual Clauses (SCCs) instead, as they appeared to have survived the Court’s scrutiny. Then came the slow recognition that there were serious question marks over the use of SCCs too, as US law does not ensure an equivalent level of protection for personal data.

Such were the repercussions of the judgment, and the many questions being asked, that on 23 July the EDPB (European Data Protection Board) adopted a set of Frequently Asked Questions (FAQ). These are helpful up to a point but by no means provide any real answer to the problems caused by the judgment. What is most interesting is the stark contrast between the EDPB’s approach and that of the ICO. The EDPB FAQ document says of the Privacy Shield, ‘Transfers on the basis of this legal framework are illegal’. However, a visit to the Privacy Shield page of the ICO website reveals the following statement:

‘We are currently reviewing our Privacy Shield guidance after the judgment issued by the European Court of Justice on Thursday 16 July 2020. If you are currently using Privacy Shield please continue to do so until new guidance becomes available. Please do not start to use Privacy Shield during this period.’

However, continuing to use the Privacy Shield carries risks, even if one assumes that there is no risk of ICO enforcement while the ICO statement remains as above. For example, individuals could bring claims for compensation. The ICO has not mentioned this, but the Berlin Data Protection Authority has referred to it in a press release (available only in German) and has also told organisations in Berlin to shift their data storage from the US back to Europe.

IMPORTANT UPDATE: On 27 July 2020, the ICO changed the statement on its website to read:

'The judgment in the Schrems II case issued by the European Court of Justice on Thursday 16 July 2020 found that Privacy Shield is no longer a valid way to transfer personal data outside of the EEA. For more information please read our latest statement.'

This statement brought the ICO into line with the EDPB.

The End of the Brexit Transition Period - Will we get an Adequacy Decision?

The European Commission has issued a communication to all member states about the changes they can expect after 31 December 2020. A small part of this covers data protection. It explains that there are existing mechanisms under the GDPR to allow transfers of personal data to a third country and that these can be deployed for the UK. However, will one of them be an adequacy decision?

The communication says: ‘As underlined in the Political Declaration, the EU will use its best endeavours to conclude the assessment of the UK regime by the end of 2020 with a view to possibly adopting a decision if the United Kingdom meets the applicable conditions. The Commission is currently conducting this assessment and has held a number of technical meetings with the United Kingdom to gather information in order to inform the process.’

The clear advice for businesses in the EU is that they should plan ahead, including for the scenario where there is no adequacy decision in respect of the UK. Perhaps a hint that an adequacy decision may not be forthcoming.

Workplace Testing for Covid-19 - A joint HR and Data Protection Perspective

Kate Grimley Evans and Caroline Banwell of Harmony HR Solutions have collaborated to provide a unique and helpful perspective on Workplace Testing for Covid-19.

Apple and Google joint initiative on COVID-19 contact tracing technology

The Information Commissioner has published her official Opinion on the Contact Tracing Framework being developed by Apple and Google. The Opinion should be read with care, with its stated limitations taken into account: in particular, it is limited to the Framework itself and does not extend to apps developed using it. It also applies only to phase 1 of the project, and a more expansive phase 2 is already envisaged.

The Commissioner concludes that the Framework is aligned with the principles of data protection by design and by default, but it is still clear that a number of potential data protection risks could arise from the way in which it is used, in particular if an app processes data outside the intended scope of the Framework. The Opinion highlights the risk that users of a contact tracing app might not understand that the data protection by design and by default principles applied in the Framework do not extend to all aspects of the app.

Most current proposals for contact tracing apps would rely on consent as the lawful basis for processing, and the Commissioner points out that it is not yet clear how consent management will work or what the practical implications of a withdrawal of consent would be.

To my mind, the biggest risk to the privacy of individuals is 'scope creep': the possibility that third-party app developers will expand the use of Covid-19 tracing apps built on the Framework beyond the original stated purpose. The Commissioner mentions this in her Opinion and, reassuringly, says she will monitor developments.

Morrisons wins its Supreme Court Case

In a judgment issued on 1 April 2020, the Supreme Court held that Morrisons (i.e. WM Morrison Supermarkets plc) was not vicariously liable for the actions of its employee when, motivated by vengeance against Morrisons, he placed the personal data of thousands of Morrisons employees on a public file-sharing site. The judgment contains a long discussion of the case law on vicarious liability which will be of most interest to employment lawyers. The key vicarious liability point was that the employee's bad motive was highly relevant. There is, however, a key data protection point: despite the conclusion on the facts that Morrisons was not vicariously liable, there was nothing in the DPA 1998 which excluded the operation of vicarious liability. On the basis that a similar conclusion is likely to be reached in relation to the Data Protection Act 2018, there remains the distinct possibility that, in future, on a different set of facts, an employer will find itself vicariously liable for a data protection breach committed by its employee.

Coronavirus update emails - What is the law?

My husband recently commented on what he described as 'huge quantities of Corona spam' arriving in his inbox and asked me whether the companies were allowed to send him these. Some appeared to be motivated by a genuine need to communicate information relating to Coronavirus measures; others were, at least to some extent, using a marketing opportunity. This off-the-cuff comment prompted an unsolicited full explanation of the law from me. After my husband had recovered, it occurred to me that this information would be very helpful to businesses (or those with an inquisitive mind).

An email of this kind needs to comply with both data protection law and the Privacy and Electronic Communications Regulations (PECR). It is easiest to start with the PECR. Basically, if an email contains marketing then it usually requires consent. Commercial companies may be able to contact existing customers under the 'soft opt-in' rule, but this only works if approached carefully and should not be attempted without careful study of the ICO guidance on the subject. Charities cannot use the soft opt-in at all. Many organisations have correctly identified that if a message does not contain any marketing then it is not caught by the PECR. A message about Coronavirus measures alone is just a service message, not marketing, but companies must be careful not to stray into marketing in the same message, for example by moving from an update on the general availability of goods to advertising particular items.

Even a service message not caught by the PECR must comply with data protection law. Somebody's name and email address are being used to contact them, and this is a use of personal data. This means that the person must reasonably expect such use (consider your privacy notices) and that a condition under Article 6 of the GDPR must be met. Examples would be that the person has given consent, that the company has a legitimate interest in contacting them, or that the email contact is necessary for the performance of a contract with that person. Where legitimate interests are relied on, there must be a Legitimate Interests Assessment in place.

In short, desperate times, but not an excuse for desperate measures: approach such emails with care. The ICO's draft Code on Direct Marketing is a helpful resource.

Cathay Pacific Airways Data Breach - 12 lessons

The monetary penalty notice issued by the Information Commissioner's Office identified 12 failings in Cathay Pacific's security measures. I have extracted these as a simple list and suggest that all organisations check for similar issues:

  1. Database back-ups were not encrypted.
  2. An internet-facing server was accessible via a known and publicised vulnerability. Both the vulnerability and the fix had been public knowledge for around 10 years.
  3. The administrator console was publicly available on the internet.
  4. One system was hosted on an operating system that was no longer supported.
  5. There was no evidence of adequate server hardening (the process of removing unnecessary features to minimise attack points).
  6. Network users were permitted to authenticate past the VPN without multi-factor authentication.
  7. Anti-virus protection was inadequate.
  8. Patch management was inadequate.
  9. Accounts were given inappropriate privileges.
  10. Penetration testing was inadequate.
  11. Retention periods were too long.
  12. There was a failure to manage the security solutions that Cathay Pacific did have in place, and to adhere to its own policies.


ICO publishes Draft Guidance on the AI Auditing Framework

This draft guidance relating to applications of artificial intelligence has recently been published for consultation. The guidance is long and detailed. For the most part it is a helpful synopsis of data protection law as it applies to artificial intelligence, and it is particularly useful for the following reasons:

1. It points out where terms are used differently in the AI context from how they are used in a data protection context, thus avoiding confusion.

2. It gives specific contextual examples, including practical help in minimising risk.

3. It points out common pitfalls, e.g. the need to treat the training phase differently from the implementation phase when deciding on the purposes and lawful bases for processing.

4. It covers other aspects of law such as the potential for discrimination where discrimination was inherent in the data used to train the models.

5. It points out key dangers, e.g. the possibility of model inversion attacks, in which attackers are able to recover personal data about the people whose data was used to train the system.

6. It covers how to deal with individual data protection rights, which can be particularly challenging in this context.

In short, this guidance is a ‘must-read’ for all involved in AI applications. The ICO itself identifies the intended audience:

Those with a compliance focus, including:

• data protection officers
• general counsel
• risk managers
• the ICO’s own auditors

Technology specialists, including:

• machine learning developers and data scientists
• software developers/engineers
• cyber security and IT risk managers