Abstract
The recent rise of AI, profiling and online surveillance shows huge shifts in both the technical approaches to, and the cultural acceptance of, the processing of personal data. These changes have exposed a number of holes in the legal protection for data subjects, which has in turn led to new legal developments: the General Data Protection Regulation was introduced to update the legal regime to deal with new technological challenges and provide strong protection to data subjects, while the CJEU has adopted a wide interpretive stance to ensure a strong, efficient and adaptive protection.
There is, however, a risk of overcorrection. While data subject rights are undeniably important, there are other stakeholder interests which must be respected, such as the freedom to carry out a business or to perform research and development into new technology. These are both socially useful activities, and we must be careful not to create a legal regime which unduly stifles them. Moreover, more restrictive rules may end up providing an even less protective regime by overly relying on consent, thus pushing the burden onto data subjects who do not have the time or inclination to read terms and conditions.
This paper will examine these risks, particularly in relation to the definition of personal data under the GDPR, art. 4(1) and the right relating to automated individual decision-making under the GDPR, art. 22. The former was subject to considerable expansion under the Data Protection Directive (see, e.g., C-434/16 Nowak and C-528/14 Breyer), while the latter contains a number of ambiguities with a wide range of possible interpretations. Both are key areas that are vulnerable to overcorrections due to a desire to protect data subjects from rapidly advancing technology, but where doing so may simply cause more harm than good.
| Original language | English |
|---|---|
| Title of host publication | SLS Conference 2018 : Law in Troubled Times |
| Publication date | 2018 |
| Status | Published - 2018 |
| MoE publication type | B3 Non-refereed article in conference proceedings |
| Event | SLS Conference 2018, Queen Mary University, London, United Kingdom. Duration: 4 Sep 2018 → 7 Sep 2018 |
Fields of Science
- 513 Law
Activities
- 1 Oral presentation
Avoiding The Overcorrection: The Rise of AI, Data Protection Law and the Problem With Overprotecting Data Subjects
Sam Wrigley (Speaker)
7 Sep 2018. Activity: Talk or presentation types › Oral presentation