Revisão de decisões automatizadas na Lei Geral de Proteção de Dados Pessoais

Bibliographic details
Main author: Korkmaz, Maria Regina Detoni Cavalcanti Rigolon
Publication date: 2022
Other authors: mariareginadcr@gmail.com
Document type: Doctoral thesis (Tese)
Language: Portuguese (por)
Source: Biblioteca Digital de Teses e Dissertações da UERJ
Full text: http://www.bdtd.uerj.br/handle/1/18276
Abstract: This Ph.D. dissertation analyzes, from a functional perspective, the legal regime for the review of automated decisions provided for in the Brazilian General Data Protection Law (Law No. 13,709 of 2018, known by the acronym "LGPD"). A deductive method of a qualitative nature was adopted, drawing on Stefano Rodotà, notably his diagnosis of the "dictatorship of algorithms" and the consequent need to develop prerogatives for the protection of the person, and following the premises of civil-constitutional law, in order to investigate whether the review of automated decisions, as conceived in the LGPD, has the normative basis of a substantial protection mechanism. The evolution of privacy up to the conception of personal data protection as an autonomous fundamental right was addressed. In addition, the need to apprehend the phenomenon from a collective standpoint was presented on the basis of the collective dimension of privacy and personal data protection, considering the scenario of opacity and of informational and power asymmetry between data subjects and processing agents. The protection of the person in the field of automated decisions demands progressive scrutiny not only of the data input but, above all, of its processing and output, which must go beyond the categories historically associated with discrimination; to demonstrate this, the insufficiency of the category of sensitive data was presented. Concepts from the field of artificial intelligence, its applications, and the notion of an algorithm were outlined in order to identify their interface with the regulatory perspective. To summarize the problems posed by the automation of decisions, a map of ethical problems was used as an investigative guideline, structured around the following challenges: inconclusive evidence; inscrutable evidence; misguided evidence; discriminatory results; transformative effects; and traceability. Contributions from the philosophy of language were also presented, especially to identify the limits underlying an automated decision-making process. The legal regime of automated decisions in the LGPD was then analyzed: the objectives and foundations of the law were presented, and its principles were examined in their interface with the field of automated decisions. Aspects of other rights of data subjects were presented, especially those with direct repercussions on the regime of automated decisions. The requirements for qualifying a decision as fully automated were examined in comparison with the European model on the subject, as was the right to explanation, understood as a precondition for the exercise of the review and for the protection of fundamental rights. Finally, the review of automated decisions was analyzed in light of constitutional legality, concluding with three proposals to ensure the substantial exercise of the review provided for in the LGPD: (i) the importance of human intervention for the substantiality of the review; (ii) conditioning the review on a systemic regime of responsibility and accountability, with the establishment of parameters; and (iii) associating the review with a comprehensive regime of the physiology of legal situations, as well as with a collective paradigm of protection that goes beyond the individual perspective.
id UERJ_d03246a58dc29445ee2f89b2f693aa2d
oai_identifier_str oai:www.bdtd.uerj.br:1/18276
network_acronym_str UERJ
network_name_str Biblioteca Digital de Teses e Dissertações da UERJ
repository_id_str 2903
dc.title.por.fl_str_mv Revisão de decisões automatizadas na Lei Geral de Proteção de Dados Pessoais
dc.title.alternative.eng.fl_str_mv Right to review of automated decision-making in the Brazilian General Data Protection Law
title Revisão de decisões automatizadas na Lei Geral de Proteção de Dados Pessoais
author Korkmaz, Maria Regina Detoni Cavalcanti Rigolon
author_facet Korkmaz, Maria Regina Detoni Cavalcanti Rigolon
mariareginadcr@gmail.com
author_role author
author2 mariareginadcr@gmail.com
author2_role author
dc.contributor.advisor1.fl_str_mv Schreiber, Anderson
dc.contributor.advisor1Lattes.fl_str_mv http://lattes.cnpq.br/7308120828952768
dc.contributor.advisor-co1.fl_str_mv Negri, Sergio Marcos Carvalho de Ávila
dc.contributor.advisor-co1Lattes.fl_str_mv http://lattes.cnpq.br/3282764176353256
dc.contributor.referee1.fl_str_mv Tepedino, Gustavo José Mendes
dc.contributor.referee1Lattes.fl_str_mv http://lattes.cnpq.br/8832153442752468
dc.contributor.referee2.fl_str_mv Souza, Carlos Affonso Pereira de
dc.contributor.referee2Lattes.fl_str_mv http://lattes.cnpq.br/1819360839405757
dc.contributor.referee3.fl_str_mv Mulholland, Caitlin Sampaio
dc.contributor.referee3Lattes.fl_str_mv http://lattes.cnpq.br/5010668019661821
dc.contributor.referee4.fl_str_mv Branco Júnior, Sérgio Vieira
dc.contributor.referee4Lattes.fl_str_mv http://lattes.cnpq.br/1052318679517461
dc.contributor.authorLattes.fl_str_mv http://lattes.cnpq.br/2016572586662958
dc.contributor.author.fl_str_mv Korkmaz, Maria Regina Detoni Cavalcanti Rigolon
mariareginadcr@gmail.com
contributor_str_mv Schreiber, Anderson
Negri, Sergio Marcos Carvalho de Ávila
Tepedino, Gustavo José Mendes
Souza, Carlos Affonso Pereira de
Mulholland, Caitlin Sampaio
Branco Júnior, Sérgio Vieira
dc.subject.eng.fl_str_mv Automated decision-making
Right to review
Right to explanation
Human intervention
Collective dimension of privacy and personal data protection
Fundamental right to the protection of personal data
dc.subject.por.fl_str_mv Decisões automatizadas
Direito à revisão
Direito à explicação
Intervenção humana
Dimensão coletiva da privacidade e da proteção de dados pessoais
Direito fundamental à proteção de dados pessoais
dc.subject.cnpq.fl_str_mv CIENCIAS SOCIAIS APLICADAS::DIREITO::DIREITO PRIVADO::DIREITO CIVIL
description This Ph.D. dissertation analyzes, from a functional perspective, the legal regime for the review of automated decisions provided for in the Brazilian General Data Protection Law (Law No. 13,709 of 2018, known by the acronym "LGPD"). A deductive method of a qualitative nature was adopted, drawing on Stefano Rodotà, notably his diagnosis of the "dictatorship of algorithms" and the consequent need to develop prerogatives for the protection of the person, and following the premises of civil-constitutional law, in order to investigate whether the review of automated decisions, as conceived in the LGPD, has the normative basis of a substantial protection mechanism. The evolution of privacy up to the conception of personal data protection as an autonomous fundamental right was addressed. In addition, the need to apprehend the phenomenon from a collective standpoint was presented on the basis of the collective dimension of privacy and personal data protection, considering the scenario of opacity and of informational and power asymmetry between data subjects and processing agents. The protection of the person in the field of automated decisions demands progressive scrutiny not only of the data input but, above all, of its processing and output, which must go beyond the categories historically associated with discrimination; to demonstrate this, the insufficiency of the category of sensitive data was presented. Concepts from the field of artificial intelligence, its applications, and the notion of an algorithm were outlined in order to identify their interface with the regulatory perspective. To summarize the problems posed by the automation of decisions, a map of ethical problems was used as an investigative guideline, structured around the following challenges: inconclusive evidence; inscrutable evidence; misguided evidence; discriminatory results; transformative effects; and traceability. Contributions from the philosophy of language were also presented, especially to identify the limits underlying an automated decision-making process. The legal regime of automated decisions in the LGPD was then analyzed: the objectives and foundations of the law were presented, and its principles were examined in their interface with the field of automated decisions. Aspects of other rights of data subjects were presented, especially those with direct repercussions on the regime of automated decisions. The requirements for qualifying a decision as fully automated were examined in comparison with the European model on the subject, as was the right to explanation, understood as a precondition for the exercise of the review and for the protection of fundamental rights. Finally, the review of automated decisions was analyzed in light of constitutional legality, concluding with three proposals to ensure the substantial exercise of the review provided for in the LGPD: (i) the importance of human intervention for the substantiality of the review; (ii) conditioning the review on a systemic regime of responsibility and accountability, with the establishment of parameters; and (iii) associating the review with a comprehensive regime of the physiology of legal situations, as well as with a collective paradigm of protection that goes beyond the individual perspective.
publishDate 2022
dc.date.accessioned.fl_str_mv 2022-08-25T14:16:47Z
dc.date.issued.fl_str_mv 2022-07-26
dc.date.available.fl_str_mv 2024-07-29
dc.type.status.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.type.driver.fl_str_mv info:eu-repo/semantics/doctoralThesis
format doctoralThesis
status_str publishedVersion
dc.identifier.citation.fl_str_mv KORKMAZ, Maria Regina Detoni Cavalcanti Rigolon. Revisão de decisões automatizadas na Lei Geral de Proteção de Dados Pessoais. 2022. 329 f. Tese (Doutorado em Direito) - Faculdade de Direito, Universidade do Estado do Rio de Janeiro, Rio de Janeiro, 2022.
dc.identifier.uri.fl_str_mv http://www.bdtd.uerj.br/handle/1/18276
identifier_str_mv KORKMAZ, Maria Regina Detoni Cavalcanti Rigolon. Revisão de decisões automatizadas na Lei Geral de Proteção de Dados Pessoais. 2022. 329 f. Tese (Doutorado em Direito) - Faculdade de Direito, Universidade do Estado do Rio de Janeiro, Rio de Janeiro, 2022.
url http://www.bdtd.uerj.br/handle/1/18276
dc.language.iso.fl_str_mv por
language por
dc.rights.driver.fl_str_mv info:eu-repo/semantics/embargoedAccess
eu_rights_str_mv embargoedAccess
dc.format.none.fl_str_mv application/pdf
dc.publisher.none.fl_str_mv Universidade do Estado do Rio de Janeiro
dc.publisher.program.fl_str_mv Programa de Pós-Graduação em Direito
dc.publisher.initials.fl_str_mv UERJ
dc.publisher.country.fl_str_mv Brasil
dc.publisher.department.fl_str_mv Centro de Ciências Sociais::Faculdade de Direito
publisher.none.fl_str_mv Universidade do Estado do Rio de Janeiro
dc.source.none.fl_str_mv reponame:Biblioteca Digital de Teses e Dissertações da UERJ
instname:Universidade do Estado do Rio de Janeiro (UERJ)
instacron:UERJ
instname_str Universidade do Estado do Rio de Janeiro (UERJ)
instacron_str UERJ
institution UERJ
reponame_str Biblioteca Digital de Teses e Dissertações da UERJ
collection Biblioteca Digital de Teses e Dissertações da UERJ
bitstream.url.fl_str_mv http://www.bdtd.uerj.br/bitstream/1/18276/2/Tese+-+Maria+Regina+Detoni+Cavalcanti+Rigolon+Korkmaz+-+2022+-+Parcial.pdf
http://www.bdtd.uerj.br/bitstream/1/18276/3/Tese+-+Maria+Regina+Detoni+Cavalcanti+Rigolon+Korkmaz+-+2022+-+Completa.pdf
http://www.bdtd.uerj.br/bitstream/1/18276/1/license.txt
bitstream.checksum.fl_str_mv fbc85fc449830a7eba7ee1ad6ac9dd7b
f0a623c5643bb2c1a6f92a97bb925566
e5502652da718045d7fcd832b79fca29
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
repository.name.fl_str_mv Biblioteca Digital de Teses e Dissertações da UERJ - Universidade do Estado do Rio de Janeiro (UERJ)
repository.mail.fl_str_mv bdtd.suporte@uerj.br
_version_ 1811728716606210048