Artificial Intelligence and identifying Controllers and Processors - 2021 may bring further clarity

The ICO has just published new guidance on Artificial Intelligence (AI). I wrote a detailed blog piece on the consultation version, so I will not cover it generally here. One important new point, however, is that the ICO has acknowledged that AI systems often involve a number of organisations, so working out who is a Controller or Processor for the purposes of data protection law can become complex. The ICO plans to address these issues in more detail when it revises its Cloud Computing Guidance in 2021, and will consult with stakeholders because of the policy questions raised. Most helpfully, the revised guidance will include example scenarios covering when an organisation is a Controller or Processor in the context of AI services.


The Fall of the Privacy Shield – What next? The ICO and the EDPB have different approaches

The decision of the European Court of Justice in Case C-311/18 – Data Protection Commissioner v Facebook Ireland and Maximillian Schrems struck down the EU-US Privacy Shield with immediate effect. This caused gasps of horror in the data protection world and more widely as people realised just how significant the judgment was and just how often the Privacy Shield was the mechanism under which personal data was being transferred to the US. Initial articles on the subject sometimes suggested using Standard Contractual Clauses (SCCs) instead, as they appeared to have survived the Court’s scrutiny. Then came the slower recognition that there were serious question marks over the use of SCCs too, as US law does not ensure an equivalent level of protection for personal data.

Such were the repercussions of the judgment, and so many were the questions being asked, that on 23 July the EDPB (European Data Protection Board) adopted a set of Frequently Asked Questions (FAQ). These are helpful up to a point but by no means provide any real answer to the problems caused by the judgment. What is most interesting is the stark contrast between the EDPB’s approach and that of the ICO. The EDPB FAQ document says of the Privacy Shield, ‘Transfers on the basis of this legal framework are illegal’. However, a visit to the Privacy Shield page of the ICO website reveals the following statement:

‘We are currently reviewing our Privacy Shield guidance after the judgment issued by the European Court of Justice on Thursday 16 July 2020. If you are currently using Privacy Shield please continue to do so until new guidance becomes available. Please do not start to use Privacy Shield during this period’

However, continuing to use the Privacy Shield carries risks, even if one assumes that there is no risk of ICO enforcement while the ICO statement remains as above. For example, individuals could bring claims for compensation. The ICO has not mentioned this, but the Berlin Data Protection Authority has referred to it in a press release (available only in German) and has also told organisations in Berlin to shift their data storage from the US back to Europe.

IMPORTANT UPDATE: On 27 July 2020, the ICO changed its statement on its website to read:

'The judgment in the Schrems II case issued by the European Court of Justice on Thursday 16 July 2020 found that Privacy Shield is no longer a valid way to transfer personal data outside of the EEA. For more information please read our latest statement.'

This statement brought the ICO into line with the EDPB.


The End of the Brexit Transition Period - Will we get an Adequacy Decision?

The European Commission has issued a communication to all member states about the changes they can expect after 31 December 2020. A small part of this covers data protection. It explains that the GDPR’s existing mechanisms for transfers of personal data to a third country can be deployed for the UK. However, will one of these be an adequacy decision?

The communication says: ‘As underlined in the Political Declaration, the EU will use its best endeavours to conclude the assessment of the UK regime by the end of 2020 with a view to possibly adopting a decision if the United Kingdom meets the applicable conditions. The Commission is currently conducting this assessment and has held a number of technical meetings with the United Kingdom to gather information in order to inform the process’.

The clear advice for businesses in the EU is to plan ahead, including for the scenario where there is no adequacy decision in respect of the UK. A hint, perhaps, that an adequacy decision may not be forthcoming.


Workplace Testing for Covid-19 - A joint HR and Data Protection Perspective

Kate Grimley Evans and Caroline Banwell of Harmony HR Solutions have collaborated to provide a unique and helpful perspective on Workplace Testing for Covid-19.


Apple and Google joint initiative on COVID-19 contact tracing technology

The Information Commissioner has published her official Opinion on the Contact Tracing Framework being developed by Apple and Google. The Opinion should be read with care, with its stated limitations taken into account: it is limited to the Framework itself and does not extend to apps developed using it, and it applies only to phase 1 of the project, with a more expansive phase 2 already envisaged.

The Commissioner concludes that the Framework is aligned with the principles of data protection by design and by default, but it is still clear that a number of potential data protection risks could arise from the way in which it is used, in particular if an app processes data outside the intended scope of the Framework. The Opinion highlights the risk that users of a contact tracing app might not understand that the data protection by design and by default built into the Framework does not extend to all aspects of the app.

Most current proposals for contact tracing apps would rely on consent as the lawful basis for processing, and the Commissioner points out that it is not yet clear how consent management will work or what the practical implications of a withdrawal of consent will be.

To my mind, the biggest potential risk to the privacy of individuals is 'scope creep': the possibility that third-party app developers will expand the use of Covid-19 tracing apps built on the Framework beyond the original stated purpose. The Commissioner mentions this in her Opinion and, reassuringly, says she will monitor developments.


Morrisons wins its Supreme Court Case

In a judgment issued on 1 April 2020, the Supreme Court held that Morrisons (i.e. WM Morrison Supermarkets plc) was not vicariously liable for the actions of its employee when, motivated by vengeance against Morrisons, he placed the personal data of thousands of Morrisons employees onto a public file-sharing site. The judgment contains a long discussion of the case law on vicarious liability which will be of most interest to employment lawyers; the key vicarious liability point was that the employee's bad motive was highly relevant. There is, however, a key data protection point: despite the conclusion on the facts that Morrisons was not vicariously liable, there was nothing in the DPA 1998 which excluded the operation of vicarious liability. On the basis that a similar conclusion is likely to be reached in relation to the Data Protection Act 2018, there remains a distinct possibility that, in the future, on a different set of facts, an employer will find itself vicariously liable for a data protection breach committed by its employee.


Coronavirus update emails - What is the law?

My husband recently commented on what he described as 'huge quantities of Corona spam' arriving in his inbox and asked me whether the companies were allowed to send him these. Some appeared to be motivated by a genuine need to communicate information relating to Coronavirus measures; others were, at least to some extent, using a marketing opportunity. This off-the-cuff comment prompted an unsolicited full explanation of the law from me. After my husband had recovered, it occurred to me that this information would be very helpful to businesses (or those with an inquisitive mind).

An email of this kind needs to comply with both data protection law and the Privacy and Electronic Communications Regulations (PECR). It is easiest to start with the PECR. Basically, if an email contains marketing then it usually requires consent. Commercial companies may be able to contact existing customers under the 'soft opt-in' rule, but this only works if approached carefully and should not be attempted without careful study of the ICO guidance on the subject. Charities cannot use the 'soft opt-in' at all. Many organisations have correctly identified that a message containing no marketing is not caught by the PECR. A message about Coronavirus measures alone is just a service message, not marketing, but companies must be careful not to stray into marketing in the same message, for example by moving from an update on the general availability of goods to advertising particular items.

Even a service message not caught by the PECR must comply with data protection law. Somebody's name and email address are being used to contact them, and that is a use of personal data. This means that the person must reasonably expect such use (consider your privacy notices) and that a condition under article 6 of the GDPR must be met. Examples would be that the person has given consent, that the company has a legitimate interest in contacting them, or that the email contact is necessary for the performance of a contract with that person. Where legitimate interest is relied on, there must be a Legitimate Interest Assessment in place.
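
For organisations triaging a batch of planned emails, the two-layer test above can be expressed as simple decision logic. The sketch below is a hypothetical simplification, not legal advice: the flags and rules are my own illustrative assumptions, and any real decision needs careful study of the ICO guidance.

    # A hypothetical simplification of the two-layer test described above.
    # The flags and rules are illustrative assumptions, not legal advice.

    def needs_pecr_consent(contains_marketing: bool,
                           is_existing_customer: bool,
                           soft_opt_in_satisfied: bool) -> bool:
        """First layer: PECR. A pure service message is not caught at all."""
        if not contains_marketing:
            return False                  # service message only, PECR not engaged
        if is_existing_customer and soft_opt_in_satisfied:
            return False                  # 'soft opt-in' (not available to charities)
        return True                       # marketing content, so consent is needed

    def has_article_6_basis(has_consent: bool,
                            legitimate_interest_with_lia: bool,
                            necessary_for_contract: bool) -> bool:
        """Second layer: a condition under article 6 of the GDPR must be met."""
        return has_consent or legitimate_interest_with_lia or necessary_for_contract

    # Example: a pure service update sent on the basis of legitimate interests,
    # with a Legitimate Interest Assessment on file.
    print(needs_pecr_consent(False, True, False))    # False: PECR not engaged
    print(has_article_6_basis(False, True, False))   # True: lawful basis exists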

In short, these are desperate times, but that is no excuse for desperate measures: approach such emails with care. The ICO's draft Code on Direct Marketing is a helpful resource.


Cathay Pacific Airlines Data Breach - 12 lessons

The monetary penalty notice issued by the Information Commissioner's Office identified 12 failings in Cathay Pacific's security measures. I have extracted these as a simple list (with a rough self-audit sketch after it) and suggest that all organisations check for similar issues:

  1. Database back-ups were not encrypted.
  2. An internet-facing server was accessible due to a known and publicised vulnerability. Both the vulnerability and the fix had been public knowledge for around 10 years.
  3. The administrator console was publicly available on the internet.
  4. One system was hosted on an operating system that was no longer supported.
  5. There was no evidence of adequate server hardening (a process for removing unnecessary features to minimise attack points).
  6. Network users were permitted to authenticate past the VPN without multifactor authentication.
  7. Anti-virus protection was inadequate.
  8. Patch management was inadequate.
  9. Accounts were given inappropriate privileges.
  10. Penetration testing was inadequate.
  11. Retention periods were too long.
  12. There was a failure to manage the security solutions which were in place or to adhere to the company's own policies.
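
The list lends itself to being encoded as an automated check against an inventory of systems. The sketch below is rough and hypothetical: the field names, thresholds and sample inventory are my own illustrative assumptions, not anything drawn from the penalty notice.

    # Hypothetical self-audit sketch based on the failings listed above.
    # All field names, thresholds and the sample inventory are assumptions.

    from dataclasses import dataclass

    @dataclass
    class System:
        name: str
        backups_encrypted: bool
        os_supported: bool
        mfa_beyond_vpn: bool
        admin_console_public: bool
        days_since_last_patch: int
        days_since_last_pen_test: int

    CHECKS = [
        ("Database back-ups not encrypted", lambda s: not s.backups_encrypted),
        ("Unsupported operating system", lambda s: not s.os_supported),
        ("No multifactor authentication beyond the VPN", lambda s: not s.mfa_beyond_vpn),
        ("Administrator console exposed to the internet", lambda s: s.admin_console_public),
        ("Patching overdue (more than 30 days)", lambda s: s.days_since_last_patch > 30),
        ("Penetration test overdue (more than a year)", lambda s: s.days_since_last_pen_test > 365),
    ]

    inventory = [
        System("booking-db", backups_encrypted=False, os_supported=True,
               mfa_beyond_vpn=True, admin_console_public=False,
               days_since_last_patch=12, days_since_last_pen_test=400),
    ]

    for system in inventory:
        for failing, applies in CHECKS:
            if applies(system):
                print(f"{system.name}: {failing}")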


ICO publishes Draft Guidance on the AI Auditing Framework

This draft guidance has recently been published for consultation. It is long and detailed. For the most part it is a helpful synopsis of data protection law as it applies to applications of artificial intelligence, but it is also useful for the following reasons:

1. It points out where terms are used differently in the AI context from how they are used in a data protection context, thus avoiding confusion.

2. It has specific contextual examples, including practical help in minimising risk.

3. It points out common pitfalls, e.g. the need to treat the training phase differently from the implementation phase when deciding on the purpose and the lawful bases for processing.

4. It covers other aspects of law, such as the potential for discrimination where bias was inherent in the data used to train the models.

5. It points out key dangers, e.g. the possibility of model inversion attacks, in which attackers are able to recover personal data about the people whose data was used to train the system (a minimal illustration follows this list).

6. It covers how to deal with individual data protection rights, which can be particularly challenging in this context.
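
To make the model inversion danger concrete, the toy sketch below runs gradient ascent on the input of a simple logistic-regression model until the model is highly confident, recovering an input 'typical' of the training class. This is a minimal illustration under my own assumptions (a toy model with known weights), not the exact attacks contemplated by the guidance, which target much richer models.

    # Toy illustration of model inversion: given access to a trained model,
    # gradient ascent on the *input* recovers a representative example of a
    # class, which can leak information about the people in the training data.
    import numpy as np

    rng = np.random.default_rng(0)

    # Pretend these weights were learned on private training data.
    n_features = 16
    w = rng.normal(size=n_features)
    b = 0.1

    def predict_proba(x):
        """Model confidence for the target class (logistic regression)."""
        return 1.0 / (1.0 + np.exp(-(x @ w + b)))

    x = np.zeros(n_features)
    learning_rate = 0.5
    for _ in range(200):
        p = predict_proba(x)
        x += learning_rate * (1.0 - p) * w   # gradient of log p with respect to x
        x = np.clip(x, -1.0, 1.0)            # keep the input in a plausible range

    print("reconstructed input:", np.round(x, 2))
    print("model confidence:", round(float(predict_proba(x)), 4))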

In short, this guidance is a ‘must-read’ for all involved in AI applications. The ICO itself identifies the intended audience:

Those with a compliance focus, including:

• data protection officers
• general counsel
• risk managers
• the ICO’s own auditors

Technology specialists, including:

• machine learning developers and data scientists
• software developers/engineers
• cyber security and IT risk managers


EDPB adopts new Guidelines on Video Surveillance

The European Data Protection Board adopted its Guidelines on Video Surveillance on 29 January 2020, with some changes made since the consultation version. The guidelines are very detailed but helpful; useful real-life examples are included and all aspects of data protection are covered. This piece is just an overview, and those needing the full detail should read the guidelines themselves.

Video Surveillance for personal or household activity

The Guidance starts by covering the use of video surveillance for purely personal or household activity, such purposes being outside the scope of the GDPR. However, care must be taken not to assume that the GDPR is irrelevant to all home use. The Guidance stresses:

‘12. This provision – the so-called household exemption – in the context of video surveillance must be narrowly construed. Hence, as considered by the European Court of Justice, the so called “household exemption” must “be interpreted as relating only to activities which are carried out in the course of private or family life of individuals, which is clearly not the case with the processing of personal data consisting in publication on the internet so that those data are made accessible to an indefinite number of people”. Furthermore, if a video surveillance system, to the extent it involves the constant recording and storage of personal data and covers, “even partially, a public space and is accordingly directed outwards from the private setting of the person processing the data in that manner, it cannot be regarded as an activity which is a purely ‘personal or household’ activity.’

It is worth reading the guidance for further examples of when home use can be within the GDPR.

Lawfulness of Processing

The guidance makes it clear that the requirement for transparency means being specific about the reason for the use of cameras:

‘Video surveillance based on the mere purpose of “safety” or “for your safety” is not sufficiently specific (Article 5 (1) (b)). It is furthermore contrary to the principle that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject (see Article 5 (1) (a)).’

In practice, the use of cameras will usually be justified on the basis of legitimate interests and the guidance helpfully reminds the reader to make a proper assessment:

‘Given a real and hazardous situation, the purpose to protect property against burglary, theft or vandalism can constitute a legitimate interest for video surveillance.

20. The legitimate interest needs to be of real existence and has to be a present issue (i.e. it must not be fictional or speculative). A real-life situation of distress needs to be at hand – such as damages or serious incidents in the past – before starting the surveillance. In light of the principle of accountability, controllers would be well advised to document relevant incidents (date, manner, financial loss) and related criminal charges. Those documented incidents can be a strong evidence for the existence of a legitimate interest. The existence of a legitimate interest as well as the necessity of the monitoring should be reassessed in periodic intervals (e. g. once a year, depending on the circumstances).’
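
Paragraph 20's advice to document relevant incidents (date, manner, financial loss) lends itself to a simple structured log that can feed the periodic reassessment. The sketch below is a minimal, hypothetical illustration; the field names and the one-year review rule are my own assumptions, not requirements of the guidelines.

    # Minimal sketch of the incident log suggested by paragraph 20 of the
    # guidelines. Field names and the review rule are assumptions only.

    from dataclasses import dataclass
    from datetime import date, timedelta
    from typing import Optional

    @dataclass
    class Incident:
        occurred: date
        manner: str                   # e.g. 'vandalism to shopfront'
        financial_loss_gbp: float
        criminal_charge_ref: Optional[str] = None

    incidents = [
        Incident(date(2019, 11, 3), "vandalism to shopfront", 2400.0),
        Incident(date(2020, 2, 18), "attempted burglary", 0.0, "CR/2020/01234"),
    ]

    # Paragraph 20 calls for reassessment at periodic intervals (e.g. yearly);
    # flag a log with no recent incidents so the legitimate interest is revisited.
    if max(i.occurred for i in incidents) < date.today() - timedelta(days=365):
        print("No incidents in the last 12 months: reassess the legitimate interest.")
    else:
        print(f"{len(incidents)} documented incident(s) support the assessment.")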

In practice, this analysis, and the second stage of balancing the legitimate interests of the Controller against those of the individuals whose images are captured, can be done using a Legitimate Interest Assessment following the template on the ICO website. Part of the balancing exercise will involve an assessment of the reasonable expectations of the individual. This is where the new guidance makes some very useful comments:

‘Data subjects can also expect to be free of monitoring within publicly accessible areas especially if those areas are typically used for recovery, regeneration, and leisure activities as well as in places where individuals stay and/or communicate, such as sitting areas, tables in restaurants, parks, cinemas and fitness facilities. Here the interests or rights and freedoms of the data subject will often override the controller’s legitimate interests.’

‘Signs informing the data subject about the video surveillance have no relevance when determining what a data subject objectively can expect. This means that e.g. a shop owner cannot rely on customers objectively having reasonable expectations to be monitored just because a sign informs the individual at the entrance about the surveillance.’

Consent

Everybody has probably seen a sign along the lines of ‘by entering these premises you consent to video surveillance’ and reflected that they are being presented with little choice in practice. The new guidance is very helpful on this issue:

’44 Regarding systematic monitoring, the data subject’s consent can only serve as a legal basis in accordance with Article 7 (see Recital 43) in exceptional cases [My emphasis]. It is in the surveillance’s nature that this technology monitors an unknown number of people at once. The controller will hardly be able to prove that the data subject has given consent prior to processing of its personal data (Article 7 (1)). Assumed that the data subject withdraws its consent it will be difficult for the controller to prove that personal data is no longer processed (Article 7 (3)).’

‘46 If the controller wishes to rely on consent it is his duty to make sure that every data subject who enters the area which is under video surveillance has given her or his consent. This consent has to meet the conditions of Article 7. Entering a marked monitored area (e.g. people are invited to go through a specific hallway or gate to enter a monitored area), does not constitute a statement or a clear affirmative action needed for consent, unless it meets the criteria of Article 4 and 7 as described in the guidelines on consent.'

'47. Given the imbalance of power between employers and employees, in most cases employers should not rely on consent when processing personal data, as it is unlikely to be freely given. The guidelines on consent should be taken into consideration in this context.’

Special category data including biometric data

The new guidance clarifies that a video surveillance system is not processing special category data just because, for example, it captures a wheelchair user (i.e. potentially information about health). However, if the system is being used for the purpose of processing health information, e.g. in a hospital setting, then it would be processing special category data and the usual need for justifications under both articles 6 and 9 of the GDPR applies.

Similarly, images of faces do not automatically count as biometric data; only when an image is used to identify a specific individual does it become biometric data.

Coming back to the issue of consent in the context of the processing of biometric data, the new guidance is very clear:

‘Finally, when the consent is required by Article 9 GDPR, the data controller shall not condition the access to its services to the acceptance of the biometric processing. In other words and notably when the biometric processing is used for authentication purpose, the data controller must offer an alternative solution that does not involve biometric processing – without restraints or additional cost for the data subject.’

Subject Access Requests

There is much coverage of this topic, including expectations as to the redaction of other individuals, but one of the most helpful snippets is the statement that an individual ought to assist the controller in finding the requested images:

‘… the data subject should (besides identifying themselves including with identification document or in person) in its request to the controller, specify when – within a reasonable timeframe in proportion to the amount of data subjects recorded – he or she entered the monitored area. The controller should notify the data subject beforehand on what information is needed in order for the controller to comply with the request.’
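
In practice, it is this stated entry time that makes retrieval feasible. Below is a minimal, hypothetical sketch of a time-window lookup over an index of recorded clips; the clip metadata and the tolerance window are my own assumptions, not anything specified in the guidelines.

    # Hypothetical sketch: locating footage for a subject access request once
    # the data subject has said when they entered the monitored area.
    from datetime import datetime, timedelta

    # (start, end, file) index of stored clips: assumed metadata.
    clips = [
        (datetime(2020, 3, 2, 9, 0), datetime(2020, 3, 2, 10, 0), "cam1_0900.mp4"),
        (datetime(2020, 3, 2, 10, 0), datetime(2020, 3, 2, 11, 0), "cam1_1000.mp4"),
    ]

    def clips_for_request(stated_entry: datetime, tolerance: timedelta):
        """Return clips overlapping the stated entry time, plus or minus a tolerance."""
        window_start = stated_entry - tolerance
        window_end = stated_entry + tolerance
        return [f for start, end, f in clips
                if start <= window_end and end >= window_start]

    print(clips_for_request(datetime(2020, 3, 2, 9, 45), timedelta(minutes=15)))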

Privacy Notice/Signage

Anybody who has already considered this issue will realise that displaying a full, compliant privacy notice alongside the use of cameras is most impractical given what such a notice should cover. The new guidance supports the use of a layered approach with the most important information on the initial sign. However, it should be noted that the expectation is still for a considerable amount of information on the initial sign, so ‘Cameras in Use – for further details see our Privacy Notice’ is not good enough. An example sign is included in the guidance.

Much of this new guidance is, in practice, already covered by the available ICO guidance, including the need to assess whether a system is needed at all and to make sure it is used only for the stated purpose. However, the new guidance adds an important layer of extra detail and clarification.