Audit AI - Co-designing Web Platform for External Algorithm Auditing
Main author: | Fernandes, Rui Marques |
---|---|
Advisors: | Nunes, Francisco; Cavaco, Sofia |
Publication date: | 2022 |
Document type: | Master's thesis (Dissertação) |
Language: | English (eng) |
Keywords: | Auditing; Artificial Intelligence; Machine Learning; Algorithms |
Rights: | Open access |
Format: | application/pdf |
Source title: | Repositório Científico de Acesso Aberto de Portugal (Repositórios Científicos) |
Full text: | http://hdl.handle.net/10362/138796 |
Abstract: | Artificial Intelligence (AI) algorithms are becoming more crucial and decisive every day as the promise that these systems can mitigate human biases intensifies, thus providing fair opportunities to everyone. When unchecked, algorithms may be subject to bias and produce discriminatory outcomes, mainly due to the use of biased datasets or to defective implementations of the algorithms. Nevertheless, there has been little follow-up on the monitoring of algorithmic systems, despite the increasing tendency to use these systems in decision-making processes. To increase the trustworthiness of these algorithms, auditing practices are starting to gain relevance among companies, researchers, and developers. Nonetheless, the amount of auditing performed is still nowhere close to the amount needed. Algorithmic auditing can prevent abuses by unethical stakeholders, providing greater scrutiny of the algorithms by searching for discrimination and bias. However, ensuring that AI algorithms reflect societal values and do not present severe cases of bias and discrimination is neither easy nor simple, considering that most of these algorithms are confidential and not easily accessible. The owners of the algorithms choose not to share their algorithms' data to protect their intellectual property, which often took years to create. This lack of access to the algorithms' source code, datasets, and results does not allow the auditor to inspect the algorithm entirely, making the auditing task far more difficult. This work aims to support the auditing of algorithms without access to their source code by proposing a solution based on a web platform. The goal is to allow developers to connect their algorithms to the platform, which auditors can then test using their own datasets. In this way, it is possible to maintain the "invisibility" of the algorithms' source code while still allowing them to be audited. The platform was built based on interviews with developers, auditors, and researchers working in this space, which allowed us to understand the problem and design an adequate solution that fulfilled the users' requirements. |
Abstract (Portuguese version, translated): | Artificial Intelligence algorithms play an increasingly crucial and decisive role in our society, with the prospect that their use can reduce the influence of human prejudice in decision-making processes, thus allowing greater equality of opportunity for everyone involved. However, if auditing algorithms were a common practice, it would regularly be found that these algorithms also present some degree of discrimination and prejudice, due to the use of biased datasets or simply to defective implementations of the algorithms themselves. Nevertheless, in practice, algorithms are not adequately verified, whether through lack of awareness of auditing practices or through the inability and difficulty of carrying them out. Auditing algorithms makes it possible to prevent abuses of power by certain stakeholders and to guarantee greater scrutiny of situations of discrimination or inequality. Auditing algorithms is an extremely complex and demanding task, since most algorithms are confidential, that is, their source code and datasets are not available to the public. Those responsible for the algorithms take these measures to protect their intellectual property; however, the inaccessibility of this information complicates the auditors' work, since they cannot fully analyse the system. Following this line of thought, our proposed solution consists of creating a prototype of a web platform where developers can register information about their algorithms, including the endpoints through which they can be audited, thus maintaining their "invisibility" while at the same time allowing them to be audited. The prototype was developed based on the feedback and ideas of the auditors and developers, who were the main users of the system. |
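
The abstracts above describe the platform's core idea: developers expose their algorithms through endpoints registered on the platform, and auditors probe those endpoints with their own datasets, never seeing the source code. The record does not specify the platform's actual API, so the following is only a minimal sketch of what such a black-box audit could look like; the endpoint URL, the request/response format, and the `gender` attribute used for the disparity check are all hypothetical.

```python
"""Minimal sketch of a black-box audit against a registered prediction endpoint.

All names here (the endpoint URL, the JSON payload shape, the "prediction"
response field, and the "gender" attribute) are hypothetical illustrations;
the dissertation's actual platform API is not described in this record.
"""
import json
from urllib import request

# Hypothetical endpoint that a developer registered with the platform.
ENDPOINT = "https://example.org/models/loan-scoring/predict"

# Auditor-supplied records: the auditor never sees the model's source code,
# only the predictions returned for their own dataset.
audit_dataset = [
    {"age": 34, "income": 42000, "gender": "female"},
    {"age": 36, "income": 41000, "gender": "male"},
    # ... more auditor-curated records ...
]


def query_endpoint(record: dict) -> int:
    """POST one record to the endpoint and return its binary prediction."""
    payload = json.dumps(record).encode("utf-8")
    req = request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        # Assumed response shape: {"prediction": 0 or 1}
        return int(json.loads(resp.read())["prediction"])


def demographic_parity_gap(records: list[dict], attribute: str) -> float:
    """Absolute difference in positive-prediction rates between the first two
    groups found for `attribute` -- a very simple bias indicator."""
    predictions_by_group: dict[str, list[int]] = {}
    for record in records:
        group = record[attribute]
        predictions_by_group.setdefault(group, []).append(query_endpoint(record))

    groups = list(predictions_by_group.values())

    def positive_rate(preds: list[int]) -> float:
        return sum(preds) / len(preds)

    return abs(positive_rate(groups[0]) - positive_rate(groups[1]))


if __name__ == "__main__":
    gap = demographic_parity_gap(audit_dataset, "gender")
    print(f"Demographic parity gap: {gap:.3f}")
```

Because the only interface is the prediction endpoint, the developer's source code, model weights, and training data stay private ("invisible"), while the auditor keeps full control over the test dataset and whatever fairness metric is applied to the returned predictions.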