
DWF Data Protection Insights July 2021

02 August 2021

Here is our round-up of the month's top data protection stories, together with practical advice on how to address the legal issues raised.  

This month's highlights include:

  • the European Commission's adequacy decisions in respect of the UK;
  • the continued focus by the UK and EU data protection authorities on Artificial Intelligence; and
  • the work of the ICO, DCMS and the CDEI on protecting children online.

This month's top story

This month's top news is the European Commission's adequacy decisions in respect of the UK. On 28 June the Commission published the long-awaited decisions, which mean that personal data transfers from the EEA to the UK (or from the UK to the EEA and back to the UK) can continue without the need for additional safeguards. However, organisations that rely on such transfers should exercise caution for the following reasons:

  • The decisions only last for four years, during which the Commission will monitor the UK and take action if the UK deviates from the current level of protection. This means that the Commission could revoke the decisions at any time or refuse to renew them once the four years expire.
  • The Prime Ministerial Taskforce on Innovation, Growth, and Regulatory Reform (TIGRR) recently recommended replacing the UK GDPR with a UK Data Protection Framework, and the Prime Minister's response appeared to endorse this approach. If this proposal goes ahead, it could affect the adequacy decisions, depending on the extent of the changes.
  • Privacy campaigners may challenge the decisions in the courts.  

While the decisions are good news, we recommend that organisations continue to prepare for the possibility that they could expire in four years or be invalidated sooner. You may want to include a mechanism in your contracts for dealing with this situation, for example by agreeing to enter into standard contractual clauses and giving the controller the right to suspend or terminate processing if the parties cannot resolve the situation. Please contact one of our data protection specialists for advice on how to deal with this.

Regulatory guidance/campaigns/other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)/European Data Protection Supervisor (EDPS)

EDPB guidance and news

EDPB publishes three sets of guidelines 

On 8 July the EDPB announced that it had adopted the following documents:

  • Final version of the guidelines on the concepts of Controller and Processor.  See the September 2020 issue of DWF Data Protection Insights for our overview of the draft version published for consultation.
  • Final version of the Guidelines on Virtual Voice Assistants.  See the March 2021 issue of DWF Data Protection Insights for our overview of the draft version.
  • A consultation draft of Guidelines on Codes of Conduct as a tool for transfers. The ICO has published a link to these guidelines, noting that they apply to the EU GDPR, that the link is provided as a useful reference, and that it will produce its own guidance on this topic in due course.

Artificial intelligence key updates

In recent months, the UK and EU data protection authorities have focused on artificial intelligence (AI) regulation and guidance.  Read our round-up of the key updates >

ICO guidance and news

ICO releases beta version of AI and Data Protection Risk Toolkit

In the March 2021 issue of DWF Data Protection Insights, we reported on the ICO's launch of an alpha version of an AI and data protection risk mitigation and management toolkit for consultation.  On 20 July the ICO published the beta version of the toolkit and announced that in the next stage of its development it wants to work with organisations to test the toolkit on live examples of AI systems.

The beta version is broadly as described in our March 2021 article, but it has been refined in two key ways. Firstly, the toolkit now follows the four stages of the AI lifecycle (business requirements and design; data acquisition and preparation; training and testing; and deployment and monitoring), plus a section on procurement.

Secondly, for each stage of the AI lifecycle, the toolkit identifies the "risk domain areas", which are now:

  1. Accountability and governance
  2. Lawfulness and purpose limitation
  3. Fairness (Statistical accuracy, bias and discrimination)
  4. Transparency
  5. Security
  6. Data Minimisation
  7. Individual rights
  8. Meaningful human review

For each risk domain area, the toolkit identifies practical steps to mitigate the risks identified. The guide to using the toolkit reminds users that if you identify a residual high risk that you cannot mitigate, you must consult the ICO before you start processing.

As we said in our March article, while the toolkit looks like a useful tool, it is by necessity generic and high-level. If you would like tailored advice on a specific AI project, please contact one of our data protection specialists.

ICO blog post: Spotlight on the Children’s Code standards - best interests of the child, detrimental use of children’s data and data minimisation

On 28 June the ICO published a blog post focusing on three principles of its Children's Code.  Click here to read our overview of the key points of this post, DCMS's business guide for protecting children on online platforms, and the Dutch DPA's decision about providing privacy notices to children in their own language >

ICO blog post: What’s next for the Accountability Framework?

On 15 July the ICO published a blog post about its plans for its Accountability Framework. The post states that the ICO has amended the Framework to reflect feedback and make it easier to navigate and use. The ICO now wants to add some real-life case studies and is asking organisations to share best practice examples.

Enforcement action

ICO enforcement

ICO fines charity £25,000 for failing to keep users' data secure

The ICO has imposed a rare fine for breach of the UK GDPR (as opposed to the Privacy and Electronic Communications Regulations (PECR)) on a charity for failing to protect the personal data of users of its internal email group.  Insufficiently secure settings caused approximately 780 pages of confidential emails to be viewable online for nearly three years. This led to personal information about 550 people being searchable online, including some users' special category data, such as information about their mental and physical health and sexual orientation.

The ICO’s investigation found that the charity:

  • should have applied restricted access to its email group and could have considered pseudonymisation or encryption to add an extra layer of protection to the personal data it held; and
  • had a negligent approach towards data protection with inadequate policies and a lack of training for staff.

ICO fines for PECR breaches

The ICO has continued to focus on fining organisations for breaches of the Privacy and Electronic Communications Regulations (PECR).  It has:

  • fined one company £200,000 for making more than 11 million unlawful claims management calls; and
  • fined another company £130,000 for making over 900,000 nuisance marketing calls, including a considerable number made to numbers registered on the Telephone Preference Service (TPS) and the Corporate Telephone Preference Service (CTPS).

These fines provide a reminder to conduct telephone marketing in accordance with data protection law, including PECR, and to screen your list against the TPS and/or CTPS (as relevant).  If you want advice on how to conduct direct marketing in accordance with the law, please contact one of our privacy specialists.

ICO enforcement notice for GDPR breaches

The ICO has issued an enforcement notice to an email marketing company in relation to a number of breaches of the UK GDPR, including:

  • breaching the transparency principle by failing to provide a clear privacy notice to inform data subjects who their personal data would be shared with; and
  • incorrectly relying on consent, when that consent was not specific and informed, as required by the GDPR.  

The ICO considered that the breaches were linked to the company's incorrect claim to be a processor, whereas it was in fact a controller.  Identifying the roles and relationship of parties collecting, processing and sharing personal data can be complex, so we recommend that organisations take specialist legal advice.  Please contact one of our data protection and privacy lawyers for assistance.

Industry news

Algorithms and transparency in the public sector

The Centre for Data Ethics and Innovation (CDEI) has published a blog post which examines how the public sector can increase transparency around the use of algorithms in decision-making to build public trust.  Click here to read our summary of the key points >

CDEI publishes report on the role of data intermediaries 

On 22 July the CDEI published a report 'Unlocking the value of data: Exploring the role of data intermediaries'.  Click here to read our overview >

CDEI publishes beta version of PETs adoption guide

On 14 July the CDEI published a blog post launching the beta version of its PETs adoption guide. The PETs in question are not furry companions, but privacy enhancing technologies, and the guide is an interactive tool intended to:

  • aid decision-making around the use of PETs in data-driven projects through a Q&A decision tree;
  • help practitioners think about how these technologies could be useful to them;
  • signpost relevant technical resources and example use-cases; and 
  • highlight some of the limitations and challenges of PETs to illuminate where they may not be the most suitable solution.

When the user has worked through the decision tree, the guide will:

  • suggest PETs that could provide appropriate solutions;
  • provide information on the practicalities and potential challenges in adopting them; and 
  • provide signposts to relevant technical resources and example use-cases from the repository.

The CDEI blog post acknowledges that PETs are not a "silver bullet" solution to privacy concerns, but should be treated as one tool in an organisation's kit for handling personal data safely and securely.  It also identifies that PETs can introduce accountability risks:

  • they can lead to a false sense of security;
  • they will not prevent unethical data gathering, processing or outcomes;
  • some stakeholders expressed concerns about PETs entrenching monopolistic power in the private sector, given that large amounts of data and technical resource are required for some PETs to be effective; and
  • there are uncertainties around how the use of some PETs should be interpreted in law, e.g. regarding what constitutes a sufficient level of anonymisation for data to no longer be considered personal data. See our article 'ICO call for views: Anonymisation, pseudonymisation and privacy enhancing technologies guidance' for details of the latest position.

For advice on all aspects of data governance, including the adoption of PETs, please contact one of our privacy specialists.

DCMS launches consultation on operation of digital identity system

On 19 July DCMS launched a consultation on its digital identity governance framework, seeking views on three issues:

  • the scope and remit of the body which will oversee digital identity and ensure compliance with the framework;
  • how to allow trusted organisations to digitally check authoritative government-held data; and
  • how to establish the legal validity of, and build confidence in, digital identities and attributes.

In the June 2021 issue of DWF Data Protection Insights we reported on the European Commission's proposal for a digital identity framework and commented that it would be interesting to see whether there would be any crossover between the two systems. We will monitor developments and report in future issues of DWF Data Protection Insights.
