Avoiding The Overcorrection: The Rise of AI, Data Protection Law and the Problem With Overprotecting Data Subjects

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific

Abstract

The recent rise of AI, profiling and online surveillance shows huge shifts in both the technical approaches to and the cultural acceptance of the processing of personal data. These changes have exposed a number of holes in the legal protection afforded to data subjects, which has in turn led to new legal developments: the General Data Protection Regulation was introduced to update the legal regime to deal with new technological challenges and to provide strong protection for data subjects, while the CJEU has adopted a wide interpretive stance to ensure that this protection remains strong, efficient and adaptive.

There is, however, a risk of overcorrection. While data subject rights are undeniably important, there are other stakeholder interests which must be respected, such as the freedom to conduct a business or to carry out research and development into new technologies. These are both socially useful activities, and we must be careful not to create a legal regime which unduly stifles them. Moreover, more restrictive rules may in practice deliver weaker protection by relying too heavily on consent, thereby pushing the burden onto data subjects who have neither the time nor the inclination to read terms and conditions.
This paper will examine these risks, particularly in relation to the definition of personal data under art. 4(1) GDPR and the right relating to automated individual decision-making under art. 22 GDPR. The former was subject to considerable expansion under the Data Protection Directive (see, e.g., C-434/16 Nowak and C-528/14 Breyer), while the latter contains a number of ambiguities open to a wide range of possible interpretations. Both are key areas vulnerable to overcorrection, driven by a desire to protect data subjects from rapidly advancing technology, but where doing so may simply cause more harm than good.
Original language: English
Title of host publication: SLS Conference 2018: Law in Troubled Times
Publication date: 2020
Publication status: Published - 2020
MoE publication type: B3 Article in conference proceedings
Event: SLS Conference 2018 - Queen Mary University, London, United Kingdom
Duration: 4 Sep 2018 - 7 Sep 2018

Fields of Science

  • 513 Law

Cite this

Wrigley, JSP 2020, Avoiding The Overcorrection: The Rise of AI, Data Protection Law and the Problem With Overprotecting Data Subjects. in SLS Conference 2018: Law in Troubled Times. SLS Conference 2018, London, United Kingdom, 4-7 Sep 2018.
