DWF Data Protection Insights February 2021

01 March 2021
Here is our round-up of the month's top data protection stories, together with practical advice on how to address the legal issues raised.

This month's highlights include:

  • the European Commission's publication of draft adequacy decisions for transfers of personal data to the UK; and
  • the Council of the EU's announcement of an agreed negotiating mandate for the ePrivacy Regulation.

Trading with Europe and Data Protection seminar

On 4 February we ran a Trading with Europe and Data Protection seminar, during which we discussed how the end of the transition period impacts your use of personal data for doing business in Europe, including:  

  • How does Brexit impact the free use of customer and employee data?
  • What are the data protection requirements that are placed on your operations in the UK and EU?
  • What are the options for dealing with those requirements as you grapple with the two regimes of the UK GDPR and EU GDPR?
  • To what extent does data protection create barriers to international trade?
  • What does the future of data protection look like for the UK and the EU, what is on the horizon and how will this benefit or impact your business?

Click here to listen to the recording.  Note that the webinar was recorded before the publication of the European Commission's draft adequacy decision, but the content is still relevant.

This month's top stories

European Commission publishes draft adequacy decisions
On 19 February the European Commission published draft adequacy decisions for transfers of personal data to the United Kingdom under the GDPR and the Law Enforcement Directive.  If these draft decisions are adopted, this will allow the transfer of personal data from the EEA to the UK to continue with no requirement for additional safeguards, such as standard contractual clauses, when 'the bridge' contained in the Trade and Cooperation Agreement expires.

The draft decisions will be reviewed by the EU Data Protection Authorities at the European Data Protection Board (EDPB), which will issue a non-binding opinion, before being presented to EU Member States for formal approval.  While the draft decisions (provided that they are adopted) are extremely welcome and will make data transfers between the EEA and the UK work much more smoothly, there are a few points to note:

  • The Decisions will expire four years after they come into effect, unless renewed.  This is because the Commission notes that, as the UK has left the EU, its data protection law may start to diverge from EU law.  The Decisions will therefore be reviewed after four years.
  • The Decisions emphasise that the UK's continued adherence to the European Convention on Human Rights is a 'particularly important element of the assessment on which the decisions are based'.
  • A number of commentators have sounded a note of caution about the possibility that the decisions (if finalised) could subsequently be declared invalid, like the Safe Harbor and Privacy Shield.  The 'GDPR today' newsletter from Max Schrems' NOYB privacy organisation contains the following reaction: 'don’t rule out a legal challenge that could bring a decision crashing down'. Organisations should bear this risk in mind, and be prepared to put safeguards in place, possibly at short notice, if this becomes necessary in the future.  An alternative approach is to consider binding corporate rules (BCRs) for intra-group transfers.

If you would like advice about any aspect of cross-border data transfers, please contact one of our data protection specialists.

Council of the EU announces agreed negotiating mandate for ePrivacy Regulation
On 10 February the Council of the EU announced that the EU member states have agreed a negotiating mandate for the long-awaited ePrivacy Regulation, which is intended to replace the existing Directive on which the Privacy and Electronic Communications Regulations (PECR) are based.  This means that the Portuguese presidency of the Council can start talks with the European Parliament to negotiate the final text.  Once the text is finalised, the Regulation will not start to apply in the EU until two years after its publication.  It will not be binding in the UK, although it will apply where UK organisations process the personal data of data subjects in the EU, and it is likely that the UK will update PECR in line with the Regulation.

The key points of the draft Regulation are:

  • It will cover electronic communications content transmitted using publicly available services and networks, and metadata related to the communication.
  • To ensure full protection of privacy rights and to promote a trusted and secure Internet of Things, the rules will also cover machine-to-machine data transmitted via a public network.
  • The rules will apply when end-users are in the EU, irrespective of where the processing takes place or where the service provider is established or located.
  • As a rule, electronic communications data will be confidential. Any interference, including listening to, monitoring and processing of data by anyone other than the end-user, will be prohibited, except when permitted by the ePrivacy Regulation.
  • Permitted processing of electronic communications data without the consent of the user includes ensuring the integrity of communications services, checking for the presence of malware or viruses, or cases where the service provider is bound by EU or member state law for the prosecution of criminal offences or prevention of threats to public security.
  • Metadata may be processed for billing, or for detecting or stopping fraudulent use. With the user’s consent, service providers could use metadata to display traffic movements to help public authorities and transport operators to develop new infrastructure where it is most needed. Metadata may also be processed to protect users’ vital interests, including for monitoring epidemics and their spread or in humanitarian emergencies.
  • In certain cases, providers of electronic communications networks and services may process metadata for a purpose other than that for which it was collected, provided that this processing is compatible with the initial purpose and strong specific safeguards apply.
  • The use of processing and storage capabilities and the collection of information from the user’s terminal equipment will only be allowed with the user’s consent or for other specific transparent purposes laid down in the regulation.
  • The end-user should have a genuine choice on whether to accept cookies or similar identifiers (see the illustrative sketch after this list).
  • To avoid cookie consent fatigue, an end-user will be able to give consent to the use of certain types of cookies by whitelisting one or several providers in their browser settings. Software providers will be encouraged to make it easy for users to set up and amend whitelists on their browsers and withdraw consent at any moment.
  • The text also includes rules on identification of the line from which a call is made, public directories, and unsolicited and direct marketing.
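
To make the consent rules for cookies and similar identifiers more concrete, the sketch below is a minimal, purely illustrative Python model of how a website operator might gate the setting of non-essential cookies on a recorded consent signal. All names and cookie categories are hypothetical; this is a simplified illustration of the general principle, not an implementation of the draft Regulation or of any browser whitelisting mechanism.

```python
from dataclasses import dataclass
from typing import Optional, Set

# Hypothetical cookie categories, following the common split between
# strictly necessary cookies and those that need prior consent.
STRICTLY_NECESSARY = "strictly_necessary"
ANALYTICS = "analytics"
ADVERTISING = "advertising"


@dataclass
class ConsentRecord:
    """A user's recorded consent choices, e.g. captured via a consent banner."""
    user_id: str
    allowed_categories: Set[str]


def may_set_cookie(category: str, consent: Optional[ConsentRecord]) -> bool:
    """Return True if a cookie in this category may be stored on the user's device.

    In this simplified model, strictly necessary cookies do not require
    consent; every other category requires a positive, recorded consent.
    """
    if category == STRICTLY_NECESSARY:
        return True
    return consent is not None and category in consent.allowed_categories


# Example: a user who has consented to analytics cookies but not advertising.
consent = ConsentRecord(user_id="u-123", allowed_categories={ANALYTICS})
print(may_set_cookie(STRICTLY_NECESSARY, consent))  # True
print(may_set_cookie(ANALYTICS, consent))           # True
print(may_set_cookie(ADVERTISING, consent))         # False
```

A real deployment would also need to record when and how consent was given, and make withdrawing consent as easy as giving it.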

We will monitor the progress of the draft Regulation and provide updates on how it will affect UK organisations in future issues of DWF Data Protection Insights.

Regulatory guidance/campaigns/other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)/European Data Protection Supervisor (EDPS)

ICO launches data analytics toolkit
The ICO has launched a toolkit for organisations considering using data analytics on personal data. The toolkit is designed to help organisations navigate the challenges that AI systems may pose for individual rights. The user answers (anonymously) a series of questions relating to the following themes:

  • lawfulness, accountability and governance;
  • the data protection principles; and 
  • data subject rights,

and is then presented with a report setting out what points the user still needs to consider.  While the report provides useful high-level guidance on the points to consider, the general nature of the questions means that the guidance is not tailored to your specific project.  In addition, it needs to be read alongside the ICO's guidance on AI.

For bespoke, in-depth advice on the use of AI/data analytics, you can contact one of our data protection specialists.

Information Commissioner looks ahead to 2021
The Information Commissioner has published a blogpost setting out the ICO's priorities for 2021.  The key points are:

  • The ICO’s immediate focus remains supporting organisations through the impacts of COVID-19.
  • The Age Appropriate Design Code (also known as the Children's Code) will start to have a real impact, as the transition period around its introduction comes to an end (on 2 September 2021), and the ICO will be working hard to support organisations to make the necessary changes to comply with the law.
  • The ICO will also be focused on supporting organisations around data sharing, following the publication of its Data Sharing Code of Practice in December 2020.  See the December 2020 issue of DWF Data Protection Insights for our overview of the Code's key points.
  • Other support for organisations planned for this year includes guidance on political campaigning, facial recognition, and codes of conduct and certification schemes.
  • The ICO's operational work will also continue, including the latest phases of its work looking at data broking, the use of sexual crime victims’ personal information, and adtech, including audits focused on data management platforms.

Safer Internet Day 2021
On 9 February the ICO published a statement and video message from the Information Commissioner in which she expressed support for Safer Internet Day 2021 and emphasised the importance of the Children's Code.

ICO Regulatory Sandbox Final Report: Novartis Pharmaceuticals UK Ltd
On 5 February the ICO published its report on its Regulatory Sandbox project with Novartis to support Novartis' development of a voice-enabled web portal.  The portal was initially intended to allow patients to fill in health questionnaires from home, but was expanded to offer more immediate support to the NHS and its patients during the COVID-19 crisis and beyond.  The key data protection issues identified were:

  • Identifying the parties' respective roles: who is a controller, processor or joint controller?  This is not always clear, but the report provides some useful guidance on how Novartis and the ICO approached the question for this specific project.
  • Processing voice/speech data, which may constitute biometric or special category data.  This depends on the purpose for which the data is processed.
  • Identifying whether the processing involved automated decision making without human intervention and whether the effect on the individual would be significant.
  • How best to communicate the required transparency information to patients.

If your organisation uses or is considering whether to use voice-enabled technology, the ICO's report provides some useful guidance, but our data protection specialists would be happy to provide bespoke support and advice.

EDPS Opinions on the Digital Services Act and the Digital Markets Act
On 10 February the European Data Protection Supervisor (EDPS) published opinions on the Digital Services Act and the Digital Markets Act.

In the EDPS's Opinion on the Digital Services Act, he welcomes the proposal, which seeks to promote a transparent and safe online environment.  The EDPS recommends additional measures to better protect individuals in relation to content moderation, online targeted advertising and recommender systems used by online platforms, such as social media and marketplaces.

In brief, the purpose of the Digital Services Act (which is in fact a Regulation) is to govern intermediary services provided to service recipients established or resident in an EU member state, irrespective of where the service provider is established.  

In relation to online advertising, the Act provides that:

  • All online platforms that display advertising must ensure that for each advert service recipients can identify it as an advert, see on whose behalf it is displayed and understand what parameters have been used to select them as an audience for the advert.
  • Very large online platforms must publish information about adverts they display, including on whose behalf they are displayed, whether they are targeted and the parameters used for targeting and the numbers of users to whom they are displayed.
  • The Regulation envisages the development of voluntary standards for the interoperability of the advert repositories maintained by very large platforms and for the transmission of data between advertising intermediaries, as well as codes of conduct in relation to this.

In his Opinion on the Digital Markets Act (DMA), the EDPS welcomes the European Commission’s proposal, which seeks to promote fair and open digital markets and the fair processing of personal data by regulating large online platforms acting as gatekeepers.  He nevertheless makes recommendations to help ensure that the DMA effectively complements the GDPR, including: making consent management easy to use; clarifying the scope of the data portability obligation and ensuring that it is compatible with the GDPR; paying more attention to the need for effective anonymisation and re-identification tests when sharing data; introducing minimum interoperability requirements for gatekeepers; promoting the development of technical standards; and clarifying the roles of the EDPB and the relevant authorities.

The DMA introduces new rules for certain core platform services acting as 'gatekeepers' in the digital sector (including search engines, social networks, messaging services, operating systems and online intermediation services).  It aims to prevent gatekeepers from imposing unfair conditions on businesses and consumers and to ensure the openness of important digital services.

Enforcement action

ICO enforcement
The ICO has continued to focus its enforcement action on breaches of the Privacy and Electronic Communications Regulations (PECR), fining six organisations a total of £750,000 for making marketing calls to individuals who had registered with the Telephone Preference Service (TPS).  The ICO also:

  • criticised the organisations involved for buying marketing lists without conducting any due diligence on the sellers; and
  • reminded companies carrying out electronic marketing that, to comply with the law, they should subscribe to the TPS to receive the register of subscribers and screen their own call lists against it.

These fines serve as a reminder of the importance of ensuring that your direct marketing complies with the law, including PECR.  In particular, you need to take a very cautious approach when buying a marketing list.  The ICO Direct Marketing Guidance (note that this has not yet been updated for the GDPR, but the sections on PECR are still current) contains guidance on buying a marketing list, including the following points (illustrated in the sketch after this list):

  • If you are buying a ‘consented’ marketing list, the consent request must have identified you specifically. Even precisely defined categories will not be enough to give you valid informed consent under the GDPR definition.
  • You must keep records to demonstrate what the individual has consented to, including what they were told, and when and how they consented.
  • If you buy personal data from another organisation, you must provide people with your own transparency information detailing anything that they haven’t already been told.
  • Organisations buying or renting a marketing list from a list broker or other third party must make rigorous checks to satisfy themselves that the third party obtained the personal data fairly and lawfully, that the individuals understood their details would be passed on for marketing purposes, and that they have the necessary consent.
  • Organisations should take extra care if using a bought-in list to send marketing texts, emails or automated calls. They must have very specific consent for this type of marketing, and in most cases indirect consent (i.e. consent originally given to another organisation) will not be enough. 
  • Remember also that the 'soft opt-in' exception for email or text marketing cannot apply to contacts on a bought-in list.
  • Data protection law requires that any personal information held should be adequate, relevant and not excessive, and that it should not be kept for longer than necessary. Organisations buying a list should decide how much of the information they need to keep. Any unnecessary personal information should be deleted.
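
For illustration, the sketch below (Python, with entirely hypothetical data structures and organisation names) shows the kind of basic screening an organisation might run over a bought-in list before making any calls: checking each number against the TPS register and confirming that a consent record naming the organisation exists. It is a simplified example of the checks described above, not a complete compliance process or legal advice.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, Optional, Set


@dataclass
class ConsentRecord:
    """What the individual agreed to, when, and how - kept as evidence."""
    phone_number: str
    named_organisations: Set[str]  # organisations the consent request identified
    consented_at: datetime
    consent_wording: str           # what the individual was actually told


def may_call(
    phone_number: str,
    our_organisation: str,
    tps_register: Set[str],
    consents: Dict[str, ConsentRecord],
) -> bool:
    """Screen a single number from a bought-in list before a marketing call.

    In this simplified model a number is callable only if it is not on the
    TPS register and a consent record naming our organisation exists.
    (PECR contains further nuances, e.g. where a TPS-registered person has
    told us they do not object, which are omitted here.)
    """
    if phone_number in tps_register:
        return False  # registered with the TPS - do not call
    consent: Optional[ConsentRecord] = consents.get(phone_number)
    return consent is not None and our_organisation in consent.named_organisations


# Example usage with toy data.
tps = {"+447700900001"}
consents = {
    "+447700900002": ConsentRecord(
        phone_number="+447700900002",
        named_organisations={"Example Ltd"},
        consented_at=datetime(2021, 1, 15),
        consent_wording="I agree to receive marketing calls from Example Ltd.",
    )
}
print(may_call("+447700900001", "Example Ltd", tps, consents))  # False - on the TPS
print(may_call("+447700900002", "Example Ltd", tps, consents))  # True
print(may_call("+447700900003", "Example Ltd", tps, consents))  # False - no consent
```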

If you are unsure whether your direct marketing activities comply with the law, or you are considering whether to undertake additional activities, our data protection specialists would be happy to review your activities or plans and advise on how best to maximise your marketing opportunities while complying with the law.

Leave.EU and Eldon Insurance Services v Information Commissioner
On the subject of direct marketing, the Upper Tribunal has confirmed the decisions of the ICO and the First-tier Tribunal that the Leave.EU campaign and Eldon Insurance Services (which belong to the same group of companies) breached PECR by including marketing material for Eldon in political emails sent to recipients who had consented to receive political emails from Leave.EU, but not marketing emails.  This provides a reminder that if you send emails or texts to your customers for other purposes, e.g. to provide contract renewal information, you must not include marketing material unless the recipient has consented or you can rely on the soft opt-in under PECR. It has been reported that the companies will appeal to a higher court, so it will be interesting to see how the court analyses and applies PECR.

It's worth flagging that some of the marketing material consisted of promotional banners featuring Eldon's 'Skippy' kangaroo character, rather than text in the body of the emails.

If you would like advice on whether your marketing communications comply with PECR and other data protection law in the light of this decision, please contact one of our data protection specialists.

Industry news

Government launches a new UK Cyber Security Council
The UK government has announced that it has set up a new Cyber Security Council to be the official governing body on training and standards for the cyber security sector.  The Council is to be funded by DCMS and the announcement states that it will:

  • provide a single governing voice for the industry to establish the knowledge, skills and experience required for a range of cyber security jobs, bringing it in line with other professions such as law, medicine and engineering;
  • boost skilled job prospects around the country by giving budding and existing workers a clear roadmap for building a career in cyber security and focus on increasing the number and diversity of people entering the profession; and
  • work with training providers to accredit courses and qualifications, and give employers the information and confidence they need to recruit effectively to ensure their cyber capability.

Cybersecurity to the Rescue: Pseudonymisation for Personal Data Protection
On 28 January the European Union Agency for Cybersecurity (ENISA) published a report on pseudonymisation and personal data protection.  The key points are:

  • Each case of personal data processing needs to be analysed to determine the most suitable pseudonymisation technique;
  • Continuous analysis of the state of the art in data pseudonymisation is needed, as new research and business models break new ground;
  • Advanced pseudonymisation scenarios should be developed for more complex cases, for example where the risks of the personal data processing are deemed to be high; and
  • Data pseudonymisation should be adopted more broadly at EU and Member State level.

While the UK has now left the EU, the principle of data minimisation, of which pseudonymisation is one aspect, remains enshrined in the UK GDPR.  If you require advice on any aspect of data minimisation, including pseudonymisation, please contact one of our data protection specialists.
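
Purely by way of illustration, the sketch below shows one widely used pseudonymisation technique: replacing a direct identifier with a keyed hash (HMAC), so that records can still be linked within a data set but the pseudonyms cannot feasibly be reversed without the separately held secret key. It is a generic Python example of the technique, not a recommendation taken from the ENISA report, and the key handling shown is deliberately minimal.

```python
import hashlib
import hmac
import secrets

# The secret key must be stored separately from the pseudonymised data set;
# anyone holding it can re-link pseudonyms to identifiers, so treat it like
# any other cryptographic secret.
PSEUDONYMISATION_KEY = secrets.token_bytes(32)


def pseudonymise(identifier: str, key: bytes = PSEUDONYMISATION_KEY) -> str:
    """Return a stable pseudonym for an identifier using HMAC-SHA256.

    The same identifier always maps to the same pseudonym (so records can
    still be linked within the data set), but without the key the mapping
    cannot feasibly be reversed or recomputed.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


# Example: pseudonymise an email address before sharing data for analysis.
record = {"email": "jane.doe@example.com", "visits": 42}
shared_record = {"person_ref": pseudonymise(record["email"]), "visits": record["visits"]}
print(shared_record)
```

Other techniques (random tokens with a lookup table, counters, encryption) trade off linkability, security and utility differently, which is exactly the kind of case-by-case analysis the report calls for.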

Council of Europe issues guidelines on facial recognition
The Council of Europe has published guidelines on facial recognition.  While these are not binding, they provide useful guidance on how to comply with data protection law, which in the UK remains almost identical to EU law.  While previous guidance on facial recognition has focused on its use by public sector bodies, this guidance is of interest because it considers its use in both the public and private sectors.  The key principles are:

  • There must be a lawful basis for the use of facial recognition technologies, and their necessity must be assessed together with the proportionality to the purpose and the impact on the rights of the data subjects.
  • Public authorities should not, as a rule, rely on consent due to the imbalance of power.  The guidelines then distinguish between law enforcement authorities and other public authorities.
  • In the private sector, the explicit, specific, free and informed consent of data subjects is required. Accordingly, the use of facial recognition technologies can only take place in controlled environments, for verification, authentication or categorisation purposes. In order to ensure that consent is freely given, data subjects should be offered easy-to-use alternatives to facial recognition (for example, using a password or an identification badge).
  • If consent is given for a specific purpose, personal data should not be processed in a way that is incompatible with this purpose. Disclosure of data to a third party should also be subject to specific consent.
  • Supervisory authorities (the EU equivalent of the ICO) should be consulted on proposals for any legislative or administrative measures implying the processing of personal data by facial recognition technologies.
  • Legislators and decision-makers should use mechanisms including certification to ensure the accountability of the developers, manufacturers, service providers or entities using these technologies.
  • The awareness of data subjects and the understanding by the general public of facial recognition technologies and of their impact on fundamental rights should be actively supported through accessible and educational actions.
  • Developers, manufacturers and service providers should follow guidelines in relation to data and algorithm quality, reliability of tools, awareness and accountability.
  • Organisations which use facial recognition technologies should follow guidelines in relation to transparency and fairness, purpose limitation, data minimisation and limited duration of storage, accuracy, data security, accountability and ethics.

If you are using or considering whether to use facial recognition technology, a data protection impact assessment (DPIA) is highly likely to be necessary, to identify and mitigate the risks.  Our data protection specialists can support you with this DPIA and advise on how to mitigate any risks identified, to ensure that your use of the technology complies with the law.  While to date the ICO's investigation into the use of facial recognition has focused on the police, these guidelines reflect increased scrutiny of its use in the private sector.

NIS2 Directive
On 22 February the European Parliamentary Research Service (EPRS) announced that the NIS2 Directive is under deliberation.  This is intended to expand the Directive on the Security of Network and Information Systems, on which the UK Network and Information Systems (NIS) Regulations 2018 are based.  In particular, the NIS2 Directive aims to create a high common level of cybersecurity across the EU Member States, respond to the growing threats posed by digitalisation and the surge in cyber attacks, strengthen security requirements, introduce stricter supervisory and enforcement measures, and increase the scope of entities covered by the original NIS Directive.

While the Directive will not be binding on the UK, it is possible that the UK may update the NIS Regulations in line with it.  We will monitor the draft Directive's progress and report on how developments may affect UK organisations in future issues of DWF Data Protection Insights.

Post-Brexit transition

This month's post-Brexit news (the draft adequacy decision for the UK) is so important that we've included it under the heading 'This month's top stories' near the start of this issue.

If you require any further information, please contact Sam Morrow or JP Buckley.
