
UK Data Protection Regime – Staying adequate whilst going it alone

Data law

The UK wants its own flexible and pragmatic DP regime. Can it achieve this whilst maintaining adequacy under EU law?

Since the end of the Brexit transition period on 1 January 2021, the UK has retained the GDPR in domestic law as the "UK GDPR", sitting alongside the Data Protection Act 2018 (the "DPA 2018"). The UK GDPR applies to the processing of personal data by UK-based organisations and to organisations outside the UK that carry out processing related to offering goods or services to individuals in the UK or monitoring individuals in the UK. The DPA 2018 came into force on 25 May 2018, supplementing the GDPR and adapting it in places. It sets out three DP regimes for the UK, the most relevant to the majority of organisations being Part 2, covering general processing. Together, these two pieces of legislation make up the UK's current DP law framework.

On 28 June 2021, the EU adopted decisions[1] on the UK's adequacy under the EU GDPR and the Law Enforcement Directive ("LED"). The EU makes clear that those decisions are given on the basis that the UK has adopted the GDPR and LED into UK law, and that they will expire after four years, when the UK will be reassessed for adequacy.

Having departed the EU, the UK government has said it now wants to take a more "common sense"[2] approach to its data laws. On 10 September 2021, it issued a consultation document entitled 'Data: a new direction' (the "Data Consultation")[3], which sets out proposed reforms to the UK's data laws and potential deviations from the GDPR. In it, the government says it wants to "make the most of data's many opportunities" and create a "pro-growth" and "innovation-friendly" regime in the UK.

So, what does the UK government want to change? A key area the Data Consultation identifies as needing reform is the DP governance of AI. To this end, the government says it will publish a National AI Strategy later this year. Chapter 1 of the Data Consultation, entitled "Reducing barriers to responsible innovation", proposes changes to GDPR provisions that the government considers may be stifling innovation in AI and machine learning, and that have caused uncertainty and ambiguity for stakeholders using AI systems.

First, the government has asked stakeholders for views on how the concept of fairness is applied to AI systems. Fairness is noted as being much broader than the DP concepts of fair data use and transparency: it touches many other areas, e.g. equality and discrimination law, employment law, procedural fairness under administrative law, and human rights law (e.g. under the ECHR). The government is concerned that there is uncertainty about the scope and substance of 'fairness' in the DP regime as applied to AI systems, and about the ICO's regulatory reach. Because the current regulatory framework could allow a very broad interpretation of fairness in a DP context for AI systems (fair data use, fairness of process and outcome fairness), there is a risk of regulatory confusion.

A further area of interest for AI system developers is the extent to which the law should allow organisations to use personal data more freely, subject to appropriate safeguards, for the purposes of training and testing AI. The government does not identify any particular compliance challenge under the UK GDPR, but says there is uncertainty among AI users as to how a given activity fits into the current DP regime and whether it is permitted.

The Data Consultation also considers whether Article 22 of the UK GDPR (rules protecting individuals in relation to automated individual decision-making) should remain part of UK law. Currently, Article 22 protects individuals from being subject to a solely automated decision, including profiling, that has legal or "similarly significant" effects on them (unless an exception applies). Where Article 22 applies, individuals must be given information about the processing and access to human intervention or a means to challenge the decision. The government states that, as automated decision-making is likely to increase greatly in the years to come, particularly within AI/ML systems, maintaining the capability to provide human review may not be practicable or proportionate. The Taskforce on Innovation, Growth and Regulatory Reform ("TIGRR") has recommended that Article 22 be removed from UK law[4], and that UK law instead permit the use of solely automated AI systems on the basis of legitimate interests or public interest. This would allow solely automated decision-making in relation to personal data with a legal or "similarly significant" effect, provided the other requirements of DP legislation are met, including a lawful basis for processing under Article 6(1) UK GDPR and, where sensitive personal data is involved, Articles 9 and 10 (as supplemented by Schedule 1 of the DPA 2018).

Another area the Data Consultation looks at is the application of DP law to scientific research. It considers consolidating the research-specific provisions in UK DP legislation and creating a statutory definition of "scientific research", to give researchers more certainty and clarity. It also looks at allowing data subjects to consent to broader areas of scientific research where the purpose of processing cannot be fully identified at the time of data collection; creating a new, separate lawful ground for processing for research purposes; and allowing further processing (re-use of personal data) for research purposes.

The UK government is also considering the creation of a limited, exhaustive list of legitimate interests for which organisations can process personal data without applying the balancing test – citing the example of Singapore, where certain defined types of processing activity are already regarded as legitimate interests of the data controller. The suggested list for the UK includes reporting criminal acts; monitoring, detecting and correcting bias in AI systems; product safety; and improving and reviewing an organisation's network security, among others.

The Data Consultation clearly signals a desire to promote considerably wider use of data, innovation and scientific research. However, the government also stresses that it believes it can achieve all this whilst maintaining world-leading data protection standards and, significantly, adequacy in the eyes of the EU. How far this is possible remains to be seen; some of the proposals have already met with opposition and concerns that they could compromise the protection of individuals' rights.[5] Watch this space.


Footnotes:

[1] Adequacy decisions | European Commission (europa.eu)

[2] Post-Brexit data-protection regime to deliver ‘common sense, not box-ticking’ | PublicTechnology.net

[3] Data: a new direction (publishing.service.gov.uk)

[4] See Chapter 1, para 101 of the Data Consultation – Data: a new direction (publishing.service.gov.uk)

[5] See UK taskforce calls for cutting GDPR protections (computerweekly.com)
