DWF Data Protection Insights November/December 2021

14 December 2021

Here is our round-up of the top data protection stories, together with practical advice on how to address the legal issues raised.  Read our review.

This month's highlights include:

  • the ICO's opinion on adtech;
  • the EDPB guidance on international transfers; and
  • DCMS's National Data Strategy Mission 1 Policy Framework: Unlocking the value of data across the economy.

Book publication and webinar news

Book publication: Data Protection and Compliance

While we will publish our annual data protection new year's resolutions in January, we do have one Christmas present recommendation for you.  We are pleased to announce the publication of Data Protection and Compliance edited by Stewart Room, DWF's Global Head of Data Protection, Privacy and Cyber Security, with contributions from other members of the team.  You can order your copy here.

Webinar on demand: Lloyd v Google – implications of Supreme Court judgment for data claims

On 12 November Stewart and other DWF data protection experts presented this webinar about the recent Supreme Court decision in Lloyd v Google, covering:

  • the issues that arose in the appeal and where the judgment leaves us;
  • the ability for claimants to recover loss of control damages, user damages and compensation for distress; and
  • how the judgment connects to other recent landmark developments affecting the tort of misuse of private information and breach of confidence, and the rules about insurance cover for litigation costs.

You can watch a recording of the webinar here.

Podcast: Cybersecurity Due Diligence in the M&A Process

In this podcast we discuss cybersecurity due diligence in the M&A process. The conversation centres on key issues and trends impacting firms pre- and post-deal, including tangible data protection risk mitigation advice for business leaders. Click here to listen to the podcast.

Regulatory guidance/campaigns/other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)/European Data Protection Supervisor (EDPS)

ICO guidance

Adtech: Information Commissioner's Opinion on data protection and privacy expectations

The Information Commissioner's Office (ICO) has published a Commissioner's Opinion calling on adtech companies to eliminate the privacy risks posed by the online advertising industry.  Click here to read our summary of the key points.

The ICO Children's Code: focus on age assurance

The ICO has published an opinion and a call for evidence on the use of age assurance in the context of its Children's Code.  Read our overview of the key points here.

DWF Data Protection Insights authors Sam Morrow and JP Buckley are members of Lexis PSL's Data Protection Intelligence Group (DPIG) and have contributed to the DPIG's work on creating age-appropriate privacy notices for children.  Please contact JP (contact details below) if you need advice on age assurance or children's privacy notices.

ICO's Data Sharing Code of Practice comes into force

In the December 2020 issue of DWF Data Protection Insights we reported on the publication of the ICO's Data Sharing Code of Practice.  The code came into force on 5 October 2021.  Please contact one of our data protection specialists if you would like us to review or advise on any of your data sharing arrangements.

Joint statement on global privacy expectations of Video Teleconferencing (VTC) companies

On 27 October the ICO published a joint statement and observations with the data protection and privacy authorities from Australia, Canada, Gibraltar, Hong Kong SAR, China and Switzerland.  The statement follows a letter signed by those parties in July 2020 which:

  • highlighted their concerns about whether privacy safeguards were keeping pace with the rapid increase in use of VTC services during the global pandemic;
  • provided VTC companies with some guiding principles to address key privacy risks; and
  • invited five of the largest VTC companies to respond.

The statement reports that, following a review of the responses, the data protection authorities engaged in a dialogue with the VTC companies, and have now published their observations.  While these have been published in the context of VTC services, they set out good data governance practice relevant to a wide variety of services. The observations emphasise the importance of:

  1. Security:
      • regular testing; and
      • ensuring that employees and third-party sub-processors understand their responsibilities around accessing and handling personal data;
  2. Privacy by design and by default:
      • privacy programmes; and
      • default settings being the most privacy protective;
  3. Know your audience:
      • when VTC services are used in contexts where discussions are particularly sensitive, e.g. education and healthcare, providers must ensure robust privacy and security safeguards; and
      • guidance – tailored privacy and security guidance for specific groups is good practice to help ensure users are confident using a VTC service and selecting the settings and features most appropriate for them;
  4. Transparency:
      • layered privacy notices, including:
        • detailed privacy notices and dashboards delineating the different categories of personal information collected;
        • privacy check-up features;
        • contextual notices in advance of video calls; and
        • pop-up written or audible notifications during calls to indicate data collection;
      • users of VTC services must be clearly informed about who their information will be shared with and why, and given six months' notice before a new processor is used;
      • VTC companies should publish transparency reports regarding law enforcement and government requests for access to data; and
  5. End-user control:
      • meeting controls – users should be given intuitive and clear controls and be alerted to the information about them that is collected; and
      • risk management – VTC companies should alert users to the risk of compromising the privacy and security of other meeting participants by making the meeting information publicly available, e.g. via social media posts.

EDPB guidance: international transfers

The European Data Protection Board has published guidance on the interplay between the territorial scope of the GDPR and its provisions on international transfers. Click here to read our overview of the key points.

Enforcement action

ICO enforcement

Cabinet Office fined £500,000 for New Year Honours data breach

The ICO has fined the Cabinet Office £500,000 for disclosing the postal addresses of the 2020 New Year Honours recipients online. The breach was caused by the Cabinet Office setting up the IT system incorrectly, and then, due to tight timescales, amending the file instead of modifying the system.  At the time of the breach, there was no specific or written process in place to sign off documents and content containing personal data before they were sent for publication.  While the Cabinet Office put such measures in place following the breach, the fine provides a reminder of the importance of having a process for dealing with the publication or sharing of files that contain or may contain personal data, to ensure that only the intended data is published.

ICO announces provisional intention to fine facial recognition provider over £17 million

Following a joint investigation with the Office of the Australian Information Commissioner, the ICO has announced its intention to fine a US company just over £17 million for breach of UK data protection laws.  The company provides a facial recognition search tool allowing users (including overseas law enforcement agencies) to search faces against a database of over 10 billion images.  The ICO noted that the database was likely "to include the data of a substantial number of people from the UK and may have been gathered without people’s knowledge from publicly available information online, including social media platforms." 

The ICO’s preliminary view is that the facial recognition provider appears to have failed to comply with UK data protection laws in several ways including by:

  • failing to process the information of people in the UK in a way they are likely to expect or that is fair;
  • failing to have a process in place to stop the data being retained indefinitely;
  • failing to have a lawful reason for collecting the information;
  • failing to meet the higher data protection standards required for biometric data (classed as special category data under the GDPR and UK GDPR);
  • failing to inform people in the UK about what is happening to their data; and
  • asking for additional personal information, including photos, which may have acted as a disincentive to individuals who wish to object to their data being processed.

The company now has the opportunity to make representations against the provisional intention, with the ICO expected to make a final decision by mid-2022.

ICO issues £140,000 fine to tackle illegal pension cold calls

The ICO has fined an organisation £140,000 for instigating over 107,000 illegal cold calls to people about pensions.  The company had asked lead generators to make calls on its behalf and paid up to £750 for the referrals. The ICO found that the company did not have valid consent – freely given, specific and informed – to instigate the making of these calls, and concluded that it contracted the lead generators to make the calls in order to try to bypass the law banning pension cold calls, which came into force in 2019.

The ICO's decision makes it clear that engaging third parties to make marketing calls does not absolve an organisation of responsibility for instigating those calls.

Charity fined for GDPR breach

The ICO has fined a charity £10,000 for sending an email to 105 people, with their email addresses being visible to the other recipients.  The ICO has reminded organisations to review their bulk email practices, highlighting the importance of:

  • adequate staff training, which staff must undertake before handling personal data;
  • using a secure method of sending bulk emails, instead of using the less secure bcc function (see the illustrative sketch after this list); and
  • an adequate data protection policy.
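
For illustration only (this is not part of the ICO's guidance), the short Python sketch below shows one common way to keep recipients' addresses hidden from one another: sending each recipient an individual message rather than a single email addressed to the whole list. The mail server, sender address and recipient list shown are hypothetical placeholders.

    # Illustrative sketch only: send each recipient a separate message so that
    # no recipient can see anyone else's email address.
    # The host, sender and recipient values are hypothetical placeholders.
    import smtplib
    from email.message import EmailMessage

    SMTP_HOST = "smtp.example.org"       # hypothetical mail server
    SENDER = "newsletter@example.org"    # hypothetical sender address
    RECIPIENTS = ["alice@example.com", "bob@example.com"]  # hypothetical recipients

    def send_individually(subject: str, body: str) -> None:
        """Send the same message to each recipient as a separate email."""
        with smtplib.SMTP(SMTP_HOST) as smtp:
            for recipient in RECIPIENTS:
                msg = EmailMessage()
                msg["From"] = SENDER
                msg["To"] = recipient  # only this recipient appears in the headers
                msg["Subject"] = subject
                msg.set_content(body)
                smtp.send_message(msg)

    send_individually("Monthly update", "Dear supporter, ...")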

DWF's data protection specialists can help you put these measures in place, so please contact us if you would like our support.

Union fined for PECR breach

The ICO has fined a union £45,000 for making unsolicited direct marketing calls to promote life insurance to individuals registered with the TPS (Telephone Preference Service), in breach of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).  While the union considered that it had obtained the individuals' consent, the ICO was not satisfied that the consent was freely given, specific and informed, as required by the GDPR/UK GDPR. 

This action provides useful reminders of:

  • the importance of screening your telephone list against the TPS; and
  • the need to provide specific information to individuals about how you intend to use their personal data, to ensure that they know what they are consenting to.

Please contact one of our data protection specialists for advice on how to ensure that your direct marketing campaigns comply with data protection law, including the GDPR/UK GDPR and PECR.

Industry news

DCMS publishes National Data Strategy Mission 1 Policy Framework: Unlocking the value of data across the economy

DCMS has published a policy paper, National Data Strategy Mission 1 Policy Framework: Unlocking the value of data across the economy. Click here to read our summary of the key points.

UK government publishes public sector algorithmic transparency standard

The Central Digital and Data Office has published an algorithmic transparency standard to help public sector organisations provide clear information about algorithmic tools they use to support decisions. Click here to read our summary of the key points.

DCMS publishes Product Security and Telecommunications Infrastructure Bill

The government has published its Product Security and Telecommunications Infrastructure Bill, which creates a new regulatory scheme to make consumer connectable products more secure against cyber attacks.  Click here to read our summary of the key points. 

AG opinion: GDPR does not prevent consumer protection associations from bringing legal proceedings for infringements of data protection rights

An advocate general (AG) of the Court of Justice of the EU (CJEU) has provided his opinion that:

  • Article 80(2) of the EU GDPR should be interpreted as meaning that it does not preclude national legislation which allows consumer protection associations to bring legal proceedings against the person alleged to be responsible for an infringement of the protection of personal data, provided that the objective of the representative action in question is to ensure observance of the rights which the persons affected by the contested processing derive directly from the EU GDPR; and
  • this is the position even where the action is raised independently of an actual infringement of an individual's rights and without a mandate from them.

This opinion is not binding on the CJEU, but is likely to be followed.  Once the Court makes its decision, it is likely to have an impact in the UK because:

  • UK organisations that process the personal data of individuals in the EEA are caught by the territorial scope of the EU GDPR; and
  • in order to maintain the EU-UK adequacy decision, the UK data protection regime needs to continue to mirror the EU regime.

Following the recent Supreme Court decision in Lloyd v Google (see above for the link to our webinar discussing the case) lawyers and other organisations with an interest in data protection are paying close attention to the rules on representative actions for breaches of data protection law.  We will report on the CJEU's decision in a future issue of DWF Data Protection Insights. 

G7 Trade Ministers' Digital Trade Principles

On 22 October the UK government published the Digital Trade Principles agreed by the G7 countries at the G7 Trade Track.  These set out a commitment to:

  • open digital markets
  • data free flow with trust
  • safeguards for workers, consumers, and businesses
  • digital trading systems
  • fair and inclusive global governance

The announcement explains that data free flow with trust comprises the following principles and concerns:

  • data should be able to flow freely across borders with the trust of individuals and businesses;
  • concern about situations where data localisation requirements are being used for protectionist and discriminatory purposes, as well as to undermine open societies and democratic values, including freedom of expression;
  • the need to address unjustified obstacles to cross-border data flows, while continuing to address privacy, data protection, intellectual property rights and security;
  • personal data must be protected by high, enforceable standards, including when it is transferred across borders – the G7 members will cooperate and promote interoperability between them;
  • non-personal data should benefit from protection, including intellectual property and trade secrets;
  • achieving consensus on common principles for trusted government access to personal data held by the private sector will help to provide transparency and legal certainty; and
  • open government data can play an important role in digital trade. Where appropriate, public sector datasets should be published in anonymised, open, interoperable, and accessible forms.

In relation to the cross-border data flow issue, it's worth noting that of the G7 members, the USA is the only one with which there are significant obstacles to data transfers.  France, Germany and Italy are EU members, and Japan and Canada have adequacy decisions.  However, Canada's decision is limited to commercial organisations, so if the G7 members want to facilitate public sector data sharing, they may seek to address that obstacle.

At this stage, these are high-level principles, so we will monitor developments and report in future issues of DWF Data Protection Insights.

Further Reading