Automated individual decision-making and profiling [on case C-634/21 - SCHUFA (Scoring)]

Bibliographic details
Main author: Silveira, Alessandra
Publication date: 2023
Document type: Article
Language: eng
Source: Repositório Científico de Acesso Aberto de Portugal (Repositórios Científicos)
Full text: https://doi.org/10.21814/unio.8.2.4842
Abstract: Automated decision-making and profiling are finally being considered before the Court of Justice of the European Union (CJEU). Article 22 GDPR states that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” – but its provisions raise many doubts among legal scholars – and for the referring court in the SCHUFA case. The problem underlying this case lies in the opacity of the inferences or predictions that result from data analysis, particularly by AI systems – inferences whose application to everyday situations determines how each of us, as a personal data subject, is perceived and evaluated by others. The CJEU has the opportunity to assess whether legal remedies exist to challenge operations that result in automated inferences which are not reasonably justified. However, the effective application of the GDPR to inferred data faces several obstacles. This has to do with the fact that the GDPR was designed for data provided directly by the data subject – not for data inferred by digital technologies such as AI systems. This is the difficulty behind the Advocate General’s Opinion.
oai_identifier_str oai:journals.uminho.pt:article/4842
dc.date.none.fl_str_mv 2023-04-30
dc.type.status.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.relation.none.fl_str_mv 2183-3435
dc.rights.driver.fl_str_mv info:eu-repo/semantics/openAccess
dc.publisher.none.fl_str_mv UMinho Editora
instname:Agência para a Sociedade do Conhecimento (UMIC) - FCT - Sociedade da Informação