
DWF Data Protection Insights April 2022

03 May 2022

Here is our round-up of the top data protection stories, together with practical advice on how to address the legal issues raised. 

Regulatory guidance/campaigns/other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)/European Data Protection Supervisor (EDPS)


ICO guidance

Anonymisation guidance part 4: Accountability and governance

The ICO has published a consultation draft of part 4 of its anonymisation guidance, focusing on the accountability and governance measures needed for producing and disclosing anonymous information. The key points are:

  • Take a comprehensive approach to governance.
  • Be clear about processes, responsibilities and oversight.
  • Use a DPIA (Data Protection Impact Assessment) to help you structure and document your decision-making processes and identify risks.
  • Be clear about how and why you intend to anonymise.
  • Work with other organisations likely to be processing and disclosing other information that could impact the effectiveness of your anonymisation.
  • Consider how different forms of anonymous information can pose different identifiability risks and choose an appropriate release model to mitigate those risks.
  • Plan for cases where it may be difficult to assess identifiability risk and implement appropriate risk mitigation measures.
  • Demonstrate transparency when processing anonymous information, as this promotes public trust.
  • Ensure decision-makers understand the latest technological and legal developments and best practices to ensure effective anonymisation.
  • Consider any other legal considerations (e.g. the Freedom of Information Act 2000, human rights law and the duty of confidentiality) that may be relevant to your anonymisation processes and decision-making.

Data protection and COVID-19 – relaxation of government measures

The ICO has published the above guidance, which sets out some key points organisations should consider around using personal information to keep their staff and customers safe during the pandemic. The main points are:

  • Is it still necessary? Think about:
    • How will collecting extra personal information help keep your workplace safe?
    • Do you still need the information previously collected?
    • Could you achieve your desired result without collecting personal information?

You should review your approach and ensure that it is still reasonable, fair and proportionate to the current circumstances, taking the latest government guidance into account.

  • Retaining information collected during the COVID-19 pandemic. You should assess any additional information which you collected and retained during the pandemic and ensure that you securely dispose of any information that is no longer required.
  • Can we still collect vaccination information?
    • If you are continuing to collect vaccination information, you must be clear about what you are trying to achieve and how asking people for their vaccination status helps to achieve this. Your use of this data must be fair, relevant and necessary for a specific purpose.
    • This is health data, so under the UK GDPR you must have a lawful basis and an Article 9 condition for processing special category data.
    • You should check the applicable government guidance.
    • As well as data protection law, you should also consider employment law and your contracts with your employees, health and safety requirements and human rights law, including privacy rights.

If you would like advice on this issue, or any other aspect of privacy law, please contact one of our specialist data protection lawyers. If necessary, we can work with our employment and regulatory experts to help you to comply with all aspects of the law.

Best interests of the child self-assessment

The ICO has made available its Best interests of the child self-assessment. This is a tool to enable you to assess children's best interests as part of your DPIA process. It comprises the following sections:

  • Overview – this provides a brief introduction to the ICO's Children's Code.
  • Step 1: understand rights – this provides guidance on the United Nations Convention on the Rights of the Child (UNCRC).
  • Step 2: identify impacts – this addresses how the following online activities may impact children's rights:
    • Age assurance
    • Data sharing between users
    • Data sharing with a third party organisation
    • Geolocation tracking
    • Online complaint and request tools
    • Connected toys and devices
    • Parental controls
    • Profiling and service personalisation
    • Profiling for automated decision-making
    • Privacy information, policies and community standards
    • Privacy and data use settings
  • Step 3: assess impacts – this provides resources and guidance on carrying out a risk assessment.
  • Step 4: prioritise actions – for each activity set out in step 2, this step:
    • Recommends actions that organisations can take; and
    • Provides guidance on translating those actions into practice.

Remember that the Children's Code applies to online services, such as apps, online games, and web and social media sites, likely to be accessed by children, not just those targeted at children. If you would like advice on how to comply with the Code, please contact one of our privacy specialists.

Enforcement action

ICO enforcement

The ICO has continued to impose a number of fines for breaches of the Privacy and Electronic Communications Regulations 2003 (PECR), including:

  • A fine of £80,000 for sending direct marketing text messages which sought to capitalise on the pandemic by referring to lockdown.
  • A fine of £40,000 for sending a marketing email to the organisation's whole database, including those who had not consented to receive marketing. This was due to an error which incorrectly categorised the email as a transactional message rather than marketing.
  • A fine of £60,000 for sending marketing texts and emails without adequate consent and failing to provide an opt-out. The organisation claimed that it had obtained consent, but the ICO held that the consent was insufficient, because the organisation did not provide a separate opt-in or opt-out box to enable applicants either to give or withdraw consent specifically in relation to direct marketing messages.
  • A fine of £30,000 for sending unsolicited direct marketing text messages without valid consent. While the organisation claimed that it had obtained consent, the ICO held that the consent was not freely given, specific and informed, as required by the GDPR definition of consent, which also applies to PECR.

These fines indicate that the ICO is continuing to focus its enforcement action on PECR breaches, so if you would like any advice on conducting direct marketing campaigns in compliance with the law, please contact one of our data protection specialists.

EU enforcement action

Belgian data protection regulator fines airports for unlawful monitoring of airport passengers for Covid

Stewart Room, DWF's Global Head of Data Protection and Cyber Security, has written an article about these fines: Legal Consequences For Covid Monitoring Emerging (forbes.com)

Industry news

DCMS publishes Cyber Security Breaches Survey 2022

The Department for Digital, Culture, Media and Sport (DCMS) has published the Cyber Security Breaches Survey 2022. Key findings include:

  • Cyber attacks are becoming more frequent with organisations reporting more breaches over the last 12 months;
  • Two in five businesses use a managed IT provider but only 13% review the security risks posed by their immediate suppliers; and
  • 30% of charities and 39% of businesses reported cyber security breaches or attacks in the last 12 months.

The DCMS press release recommends resources for large businesses, small businesses and charities to improve cyber security practices.

On Tuesday 24 May, 15:00 – 16:00 GMT we will be hosting our next Tech and Data Leaders Forum webinar: Security State of the Nation – Updates on Critical Legal Developments, Threats and Risks. During the session our UK Data Protection and Cyber Security experts will discuss, amongst other things, the key points arising from the DCMS survey. If you are interested in joining us on this occasion, please register here.

European Parliament and Council reach provisional agreement on Digital Markets Act

The EU institutions have announced that the Parliament and Council have reached provisional agreement on the Digital Markets Act (DMA). The DMA's key points are:

  • It imposes obligations on "gatekeepers": platforms that have had an annual turnover of at least EUR 7.5 billion within the EU in the past three years or have a market valuation of at least EUR 75 billion, and that also have at least 45 million monthly end users and at least 10,000 business users in the EU. A gatekeeper must also control one or more core platform services (e.g. browsers, messengers or social media) in at least three member states.
  • Small and Medium-sized Enterprises (SMEs) are exempt from being gatekeepers, apart from in exceptional cases. However, there is a category of "emerging gatekeeper", so the Commission can impose obligations on companies whose competitive position is proven but not yet sustainable.
  • Gatekeepers will have to make sure that users have the right to unsubscribe from core platform services, ensure interoperability and provide access to marketing and advertising data.
  • Gatekeepers will not be able to engage in self-preferencing (favouring their own services), reuse certain private data, establish unfair conditions for business users, pre-install certain software or require app developers to use certain services.
  • Penalties include fines of up to 10% of a gatekeeper's total worldwide turnover, rising to 20% for repeat offences. Persistent breaches can result in investigation, potentially leading to behavioural or structural remedies, including a merger ban.
  • The Commission is sole enforcer, but it can engage in regulatory dialogue. An advisory committee and a high-level group will be set up. Member states will be able to empower national competition authorities to start investigations and transmit their findings to the Commission. 

While the DMA will not form part of UK law, it will apply to any UK organisations that fall within the definition of "gatekeeper".
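The quantitative gatekeeper thresholds and fine caps described above can be sketched as a simple check. This is purely illustrative: the function and parameter names are our own, and the legal designation process involves further qualitative criteria and rebuttal mechanisms beyond these numbers.

```python
# Illustrative sketch of the DMA's quantitative "gatekeeper" thresholds.
# Names are hypothetical; the legal test is more nuanced than this.

def is_gatekeeper(annual_eu_turnover_eur: float,
                  market_valuation_eur: float,
                  monthly_end_users_eu: int,
                  business_users_eu: int,
                  core_platform_member_states: int) -> bool:
    """True if the quantitative thresholds summarised above are all met."""
    # Financial test: EUR 7.5bn EU turnover OR EUR 75bn market valuation.
    financial_test = (annual_eu_turnover_eur >= 7.5e9
                      or market_valuation_eur >= 75e9)
    # User test: 45m monthly end users AND 10,000 business users in the EU.
    user_test = (monthly_end_users_eu >= 45_000_000
                 and business_users_eu >= 10_000)
    # Presence test: core platform services in at least three member states.
    presence_test = core_platform_member_states >= 3
    return financial_test and user_test and presence_test


def max_fine_eur(worldwide_turnover_eur: float,
                 repeat_offence: bool = False) -> float:
    """Cap on DMA fines: 10% of total worldwide turnover, 20% for repeats."""
    return worldwide_turnover_eur * (0.20 if repeat_offence else 0.10)
```

For example, a platform with EUR 8 billion EU turnover, 50 million monthly end users, 12,000 business users and services in three member states would meet the thresholds, and a repeat offence could expose it to a fine of up to 20% of worldwide turnover.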

How to deal with Legacy IT

Legacy IT refers to an organisation's IT infrastructure and systems, their component software and hardware, and related business processes. It tends to be characterised by "old" applications, but ageing, poorly-maintained and out-of-control infrastructure are equally part of the issue. It is a multi-billion pound problem for society and our government.

Legacy IT can impact cyber and national security, operational resilience and continuous improvement, digital transformation and value for money.

Organisations adopting legacy data protection approaches put their data at risk and fail to leverage the capabilities and advantages of today's modern storage systems. Technology has advanced rapidly, and traditional storage and backup data protection technologies were not designed to cope with today's more demanding environments. Downtime is more than just lost revenue: it means loss of customer confidence, damage to an organisation's brand and reputation, and loss of employee confidence, all of which can have huge impacts on business operations and the ability to remain competitive.

To prevent legacy IT becoming more problematic, organisations need to make sure their technology is kept up-to-date, that clear and flexible requirements are built into contracts upfront, and that contracts are managed throughout their lifetime with the prevention of legacy IT in mind. Organisations should also place obligations on suppliers: suppliers should use risk management and report on the status of current and potential future legacy IT that falls under their management.

For advice on these issues, please contact one of our data protection specialists.

Authors: Sam Morrow and Julia Layton
