
DWF Data Protection Insights May 2021

03 June 2021
Here is our round-up of the month's top data protection stories, together with practical advice on how to address the legal issues raised – in the month in which the third anniversary of the GDPR coming into force was celebrated.  A whirlwind of development has taken place since then, and this month has been no different! 

This month's highlights include:

  • an update on the EU-UK adequacy decisions and standard contractual clauses;
  • news of the ICO's enforcement action against a company for sending direct marketing in breach of the law, where the company unsuccessfully argued that the emails were service messages; and
  • the Dutch supervisory authority's fine imposed on a company for failing to appoint an EU representative.

This month's top stories

European Parliament passes resolution calling on Commission to modify draft EU-UK adequacy decisions
On 21 May the European Parliament passed a resolution calling on the European Commission to modify its draft UK adequacy decisions to bring them in line with rulings of the Court of Justice of the EU and address concerns raised by the European Data Protection Board (EDPB).  See the April 2021 issue of DWF Data Protection Insights for our report on the EDPB's opinion.  The concerns raised relate to the safeguarding of onward transfers of EU data from the UK to third countries, exemptions under the UK data protection regime for immigration and national security related purposes, and bulk access to data.  On 25 May the European Court of Human Rights ruled that the UK's mass surveillance regime (under the Regulation of Investigatory Powers Act 2000) breached the European Convention on Human Rights, and on 26 May the UK Court of Appeal decided that the immigration exemption breaches GDPR.

The Commission will now seek approval of the draft adequacy decisions from the EU member states' representatives, who will vote on a qualified majority basis: approval requires at least 55% of member states, representing at least 65% of the total EU population, to vote in favour.  We are monitoring the position closely and will report on any significant developments.

Given the ongoing uncertainty, we recommend that organisations continue to plan how to deal with data transfers from the EEA to the UK in case the decisions are not adopted, or are subsequently invalidated.
UK SCCs to go out for consultation in summer 2021
At the ICO's annual Data Protection Practitioners' Conference on 5 May the Deputy Commissioner confirmed that the ICO is working on bespoke UK standard contractual clauses (SCCs) for international data transfers and the draft will be published for consultation in the summer.  He also stated that the ICO is considering recognising transfer tools from other countries, such as the EU SCCs.

Regulatory guidance / campaigns / other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)/ European Data Protection Supervisor (EDPS)

EDPB guidance and news

On 20 May the EDPB published the following:

  • Opinions that the first draft decisions on transnational Codes of Conduct (Codes) presented to the Board by the Belgian and French supervisory authorities (SAs) comply with GDPR.  The Belgian SA's draft decision concerns the EU CLOUD Code, addressed to cloud service providers, and the French SA's draft decision concerns the CISPE Code, addressed to cloud infrastructure service providers.  The Codes aim to provide practical guidance and set out specific requirements for processors in the EU, but they are not to be used in the context of international transfers of personal data.
  • A statement on the Data Governance Act (DGA), which is a follow-up to the joint EDPB-EDPS opinion on the DGA - see the March 2021 issue of DWF Data Protection Insights for our report on that opinion.  The statement reiterates the importance of ensuring that the DGA is consistent with existing EU law, including the GDPR.
  • Recommendations on the legal basis for the storage of credit card data for the sole purpose of facilitating further online transactions. The recommendations state that consent in accordance with the GDPR should be considered the sole appropriate legal basis for storing credit card data after a purchase is made.
Data sharing code of practice laid before Parliament
In the December 2020 issue of DWF Data Protection Insights we reported that the ICO had published its data sharing code of practice.  On 18 May the ICO reported that the code had been laid before Parliament and that it will come into force 40 sitting days later, unless any objections are raised.

If you would like any advice on managing your data sharing arrangements, including drafting or reviewing a data sharing agreement, please contact one of our specialist data protection lawyers.
CMA-ICO joint statement on competition and data protection law

On 19 May the Competition and Markets Authority (CMA) and the ICO published a joint statement on the relationship between competition and data protection in the digital economy.  The statement indicates that cooperation in digital markets will be a key focus (see https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2021/05/ico-and-cma-set-out-blueprint-for-cooperation-in-digital-markets/), including competition and user experience in digital advertising markets and an investigation into both the data protection and competition aspects of real time bidding. The statement identifies three key categories of synergy between data protection and competition:

  • user choice and control - where users have a genuine choice over the service or product they prefer, providers compete on an equal footing to attract their custom, so are less likely to use 'take it or leave it' terms regarding the use of personal data;
  • standards and regulations to protect privacy - data protection law and competition law must complement each other in respect of achieving efficient market outcomes that involve processing personal data; and
  • data-related interventions to promote competition - interventions to provide or restrict access to data (including personal data) can be an important tool in promoting competition in digital markets.

As well as these synergies, the statement also recognises two areas of tension between data protection and competition:

  • interventions that seek to overcome barriers to competition by providing third parties with access to personal data; and
  • where data protection requirements may be interpreted by industry in a way that risks distorting competition, e.g. with the potential effect of unduly favouring the business models of large, integrated platforms over smaller, non-integrated suppliers.

As well as data protection expertise, our team has expertise in competition law, so please let us know if you need advice on the impact of either or both areas of law on your organisation's data processing activities.

ICO and the New Zealand Office of the Privacy Commissioner sign Memorandum of Understanding

On 12 May the ICO announced that it had signed a Memorandum of Understanding (MOU) with the New Zealand Office of the Privacy Commissioner.  The MOU codifies and sets out how the authorities will:

  • continue to share experiences and best practice; 
  • cooperate in specific projects of interest; and 
  • share information or intelligence to support their enforcement work.
ICO blogpost: Spotlight on the Children’s Code standards – data protection impact assessments

On 27 May the ICO published a blogpost about the steps you should take as part of your DPIA to assess and mitigate the data protection risks that your service poses to the rights of children who are likely to access it:

  • Describing the processing of personal data you plan to do, including matters such as the age range of children likely to access the service, plans for any parental controls and the use of any nudge techniques.
  • Consulting with children and parents – the ICO expects larger organisations to do this in most cases. If you consider that it is not possible to do any form of consultation, or it is unnecessary or wholly disproportionate, you should record that decision in your DPIA, and be prepared to justify it.
  • Assessing necessity, proportionality and conformance, including how you conform to each of the standards in the Children’s Code.
  • Assessing how your processing impacts on the best interests of child users.
  • Identifying, assessing and mitigating risks, such as the potential impact on children and any harm or damage your data processing may cause – whether physical, emotional, developmental or material. If you identify a high risk that you are not mitigating, you must consult the ICO. The ICO is developing guidance to help organisations identify and assess data-related risks to children, building on the beta Children’s Code Harms Framework (see below for more information about the Framework).
ICO blogpost: Applying the Children’s Code harms framework: a gaming sector case study

On 24 May the ICO published a blogpost on its workshop with an international gaming company in which it explored the Children's Code Harms Framework.  The blogpost describes how they worked through the framework:

Step 1: Mapping children’s data journeys

  • What children’s data do you process, and how and where do you process it?
  • For what purpose are you using this data, and what is the legal basis for this?
  • What age ranges of child users should you consider?
  • Where are the key associated privacy choices and user experiences?

Step 2: Reflecting on risks and children’s rights

Once you've created a data map, you need to identify the risks to the children's rights and freedoms, including the risks of financial harm, bodily harm, developmental harms and unwarranted intrusion, and balance these risks with considerations of how the children's rights are positively supported.

If your organisation provides online services, such as apps, online games, and web and social media sites which are likely to be accessed by children and you want advice on how to comply with the Children's Code, including carrying out a DPIA, please contact one of our data protection specialists.

ICO fines credit card company for sending marketing emails to opted-out customers

The ICO has fined a credit card company £90,000 for sending more than 4 million marketing emails to customers who had opted out from them, in breach of PECR.  The company claimed that the emails were service messages, not marketing emails, but the ICO stated that they were marketing. The emails contained:

  • details of the rewards of shopping online with the credit card; 
  • details of how to get the most out of using the card; and 
  • encouragement for customers to download the company's app. 

This enforcement action provides a useful reminder of two key points:

  • ensure that you understand the distinction between service messages and marketing; and
  • keep your database updated to reflect which customers have opted out of marketing and ensure that you do not send direct marketing to those customers.
ICO fines company for sending marketing emails to people who provided their personal data for contact tracing

The ICO has fined a company that provides digital contact tracing services (which work through people scanning a QR code when arriving at a business's premises) for using those people's contact details to send nearly 84,000 nuisance emails, in breach of PECR.  The ICO's report states that it also contacted 16 QR code providers to check that they were handling personal details correctly.  The ICO took the opportunity to remind businesses of the guidelines they need to follow as the UK economy continues to open up:

  • Adopt a data protection by design approach from the start when they develop new products;
  • Make privacy policies clear and simple so that people understand how their information will be handled;
  • Not keep any personal data they have collected for contact tracing longer than stated in the guidelines issued by the relevant public health authority (usually 21 days);
  • Not use the personal data for marketing or any other purpose; and
  • Keep up to date with the guidance on the ICO's data protection and coronavirus information hub.

If you want advice about how to:

  • carry out direct marketing in compliance with data protection law, including PECR; or 
  • collect and process personal details as part of reopening your business premises safely, 

please contact one of our data protection specialists.

Dutch DPA fines non-EU business for failure to appoint an EU representative

On 12 May the Dutch Data Protection Authority announced that it had fined a business based outside the EU for failing to appoint a European representative, as required by the GDPR.  The DPA imposed a fine of €525,000, plus an additional €20,000 per week, up to a maximum of €120,000, if the business continued to fail to comply.

This case provides a reminder that now the UK has left the EU, UK businesses need to appoint a representative in the EEA if:

  • they are a data controller or processor; and
  • they do not have a branch, office or other establishment in any EU/EEA state, but they either:
      • offer goods or services to individuals in the EEA; or
      • monitor the behaviour of individuals in the EEA.

In addition, under the UK GDPR, non-UK organisations need to appoint a UK representative if they do not have an establishment in the UK and they carry out those activities in respect of individuals in the UK.

We discussed this requirement in our Trading with Europe and Data Protection seminar on 4 February - click here to listen to the recording.  If you need advice on whether you need to appoint a representative in the EEA or the UK, please contact one of our data protection specialists.

DCMS announces National Data Strategy Forum

On 18 May DCMS (the Department for Digital, Culture, Media and Sport) published the government's response to the consultation on the National Data Strategy and announced the launch of a National Data Strategy Forum with the stated aim of "helping the country seize the opportunities of data".

The DCMS press release states that the National Data Strategy is an ambitious, pro-growth strategy that is driving the UK forward in building a world-leading data economy that works for everyone, while ensuring public trust in data use.  It lays out five priority "missions" to capitalise on the opportunities data offers:

  1. Unlocking the value of data across the economy;
  2. Securing a pro-growth and trusted data regime;
  3. Transforming government’s use of data to drive efficiency and improve public services;
  4. Ensuring the security and resilience of the infrastructure on which data relies; and
  5. Championing the international flow of data.
CDEI releases blog post on the European Commission’s proposed AI regulation

In the April 2021 issue of DWF Data Protection Insights, we reported on the European Commission's proposal for an Artificial Intelligence (AI) Regulation and on the blogs about AI assurance published by the CDEI (the Centre for Data Ethics and Innovation, part of DCMS).

On 11 May the CDEI published a blog post on the Commission's proposal, which:

  • summarises key elements of the proposed regulation;
  • notes that, like the GDPR, the proposed regulation will affect international organisations;
  • focuses on how the proposal for AI 'conformity assessments' highlights the need for an ecosystem of effective AI assurance, which gives citizens and businesses confidence that the use of AI technologies conforms to a set of agreed standards and is trustworthy in practice;
  • highlights the distinction between two types of assurance:
      • compliance, which aims to test or confirm whether a system, organisation or individual complies with a standard, using audits, certification and verification; and
      • risk assurance, which asks open-ended questions about how a system works to ensure that the system is trustworthy;
  • states that the CDEI's upcoming AI assurance roadmap will principally focus on the development of an assurance ecosystem in the UK; but
  • recognises that it will be important to work with international partners to facilitate the scale-up and interoperability of AI assurance services and approaches across jurisdictions.

As we've reported in several recent issues of DWF Data Protection Insights (see the March 2021 issue, where we reported on DCMS's launch of its national AI strategy), the UK government is scrutinising all aspects of AI.  We will continue to monitor developments and report on them in future issues. 

Call for information on Computer Misuse Act 1990

On 11 May the Home Office published a call for information on the Computer Misuse Act 1990 (CMA).  The National Cyber Security Strategy 2016-21 (NCSS) identified two major categories of cybercrime:

  • Cyber-dependent crimes, such as hacking into computer systems to view, steal or damage data; and
  • Cyber-enabled crimes, which include 'traditional' crimes such as cyber-enabled fraud and data theft.

The CMA is the main UK legislation relating to cyber-dependent crime and, while it has been updated since it came into force, the changes have been limited, meaning that it is now out of date.  The call for information seeks the views of respondents on the following areas:

  • Context – how organisations understand and use the CMA;
  • Offences – whether the offences set out in the CMA are adequate;
  • Protections – whether the protections for legitimate cyber security activity provide adequate cover;
  • Powers – whether law enforcement agencies have adequate powers to tackle cybercrime;
  • Jurisdiction – whether the CMA provides adequate criminalisation of offences carried out against the UK from overseas;
  • Sentences – whether sentences are adequate or whether the sentencing guidelines need to be changed; and
  • International comparisons – examples of legislation in other countries that the UK should consider.

We will monitor developments relating to an updated Computer Misuse Act and report in future issues of DWF Data Protection Insights.

 
Call for views on cyber security in supply chains and managed service providers

On 17 May DCMS published a call for views on cyber security in supply chains and managed service providers, which focuses on two aspects of supply chain cyber security:

  1. how organisations across the market manage supply chain cyber risk and what additional government intervention would enable organisations to do this more effectively; and
  2. the suitability of a proposed framework for Managed Service Provider security and how this framework could most appropriately be implemented to ensure adequate baseline security to manage the risks associated with Managed Service Providers.

Section 1C: Supplier Assurance refers to the National Cyber Security Centre's supplier assurance questions, which cover the priority areas organisations should consider when ensuring their suppliers have appropriate cyber security protocols in place.  This includes a personal data section, which sets out questions to establish whether a supplier handles or processes any personal data as part of their service to an organisation, and if so, whether it meets the GDPR security principles.  The consultation asks whether these questions should cover any additional areas.

As always, we will monitor the progress of this call for views and report on any relevant developments in a future issue of DWF Data Protection Insights.

DCMS publishes draft Online Safety Bill
On 12 May DCMS published the draft Online Safety Bill.  If it becomes law, this will require providers of "user-to-user" and search services to prevent the proliferation of illegal content and activity online and ensure that children who use their services are not exposed to harmful content. The draft Bill will now be scrutinised by a parliamentary joint committee before being introduced to Parliament.

The child protection provisions complement the ICO's Age-Appropriate Design Code, commonly known as the Children's Code.  See the section ICO blogpost: Applying the Children’s Code harms framework: a gaming sector case study above for an update on the code.
UK government publishes public sector guidance on automated decision-making

On 13 May the Cabinet Office, the Central Digital and Data Office and the Office for Artificial Intelligence published a guidance document: Ethics, Transparency and Accountability Framework for Automated Decision-Making for use by government departments.  The framework was created in response to a review which found that the government should produce clearer guidance on using artificial intelligence (AI) ethically in the public sector.

The guidance starts by distinguishing between:

  • Solely automated decision-making - decisions that are fully automated with no human judgment; and
  • Automated assisted decision-making – automated or algorithmic systems that assist human judgment and decision-making.

The guidance applies to both types of automated decision-making, but notes that the GDPR gives individuals the right (with limited exceptions) not to be subject to solely automated decisions which result in a legal or similarly significant effect.

The guidance sets out a seven-step framework process to follow when using automated decision-making:

1. Test to avoid any unintended outcomes or consequences - prototype and test your algorithm or system so that it is fully understood, robust, sustainable and delivers the intended policy outcomes (and unintended consequences are identified).  Conduct DPIAs (data protection impact assessments) where appropriate.
2. Deliver fair services for all users and citizens - involve a multidisciplinary and diverse team in the development of the algorithm or system to spot and counter prejudices, bias and discrimination.  Areas of potential bias overlap with special category data, e.g. race, ethnicity, sexual orientation and political or religious belief.
3. Be clear who is responsible - work on the assumption that every significant automated decision should be agreed by a minister and all major processes and services being considered for automation should have a senior owner.  Where the decision-making involves the use of personal data, the relevant DPO (data protection officer) should be involved.
4. Handle data safely and protect citizens’ interests - ensure that the algorithm or system adequately protects and handles data safely, and fully complies with data protection legislation.  The department needs to ensure that:

  • implementation aligns with the government Data Ethics Framework;
  • the algorithm and system keep data secure and comply with data protection law; and
  • data governance processes adhere to data protection law - this includes the core principle of data protection by design and default, and where required, completion of a DPIA.

5. Help users and citizens understand how it impacts them - under data protection law, for fully automated processes, you are required to give individuals specific information about the process.  The guidance states that you should work on the basis of a 'presumption of publication' for all algorithms that enable automated decision-making, notifying citizens in plain English when a process or service uses automated decision-making.
6. Ensure that you are compliant with the law – as well as data protection law, this includes the Equality Act 2010 and the Public Sector Equality Duty.
7. Build something that is future proof - continuously monitor the algorithm or system, institute formal review points (recommended at least quarterly) and enable end-user challenge, to ensure it delivers the intended outcomes and mitigates unintended consequences that may develop over time.

As well as setting out this framework, the guidance contains some general points to consider:

  • Algorithms are not the solution to every policy problem - they should not be the go-to solution to resolve the most complex and difficult issues because of the high risk associated with them; and
  • The risks are dependent on policy areas and context - senior owners should conduct a thorough risk assessment, exploring all options.  You should be confident that the policy intent, specification or outcome will be best achieved through an automated or algorithmic decision-making system.

If you need advice on implementing any form of automated decision making in your organisation, please contact JP Buckley.
