Ocular Recognition Using Deep Features for Identity Authentication

Bibliographic details
Lead author: Vizoni, Marcelo V. [UNESP]
Publication date: 2020
Other authors: Marana, Aparecido N. [UNESP], Paiva, A. C., Conci, A., Braz, G., Almeida, J. D. S., Fernandes, L. A. F.
Document type: Conference paper
Language: English
Source: Repositório Institucional da UNESP
Full text: http://hdl.handle.net/11449/209188
Abstract: Ocular biometrics has recently been gaining importance because, in some cases, biometric systems based on characteristics of the whole face perform poorly. This paper presents a new method for person authentication based on ocular deep features, which are extracted from the ocular region of the face by a very deep Convolutional Neural Network (CNN). Another interesting aspect of our method is that, instead of using the deep features directly as input to the authentication system, it uses the difference between the probe and gallery deep features; that is, our method adopts a pairwise strategy. A binary support vector machine is trained to determine whether a given difference vector corresponds to a genuine or an impostor comparison. The proposed method was evaluated on the left-eye subset of the UBIPr dataset with five pre-trained CNN architectures. Using the pre-trained VGG-Face, it achieved a state-of-the-art result (an Equal Error Rate of 3.18%).
Keywords: ocular biometrics; deep learning; convolutional neural networks; person authentication
Published in: Proceedings of the 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), 27th Edition. New York: IEEE, pp. 155-160, 2020. ISSN 2157-8672.
Web of Science: WOS:000615731300028
Publisher: IEEE
Affiliation: São Paulo State University (UNESP), Bauru, SP, Brazil
Funding: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), grant 001; NVIDIA Corporation (GPU Grant Program)
Rights: open access
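
The method described in the abstract reduces identity verification to binary classification over feature-difference vectors. Below is a minimal Python sketch of that pipeline, assuming a Keras VGG16 backbone as a stand-in for the paper's VGG-Face (which is not bundled with Keras), an absolute element-wise difference as the pairing operation, and hypothetical training arrays; this is an illustration under stated assumptions, not the authors' code.

import numpy as np
from sklearn.svm import SVC
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.keras.models import Model

# Assumption: VGG16/ImageNet weights stand in for the paper's VGG-Face backbone.
base = VGG16(weights="imagenet", include_top=True)
extractor = Model(inputs=base.input, outputs=base.get_layer("fc2").output)

def deep_features(images):
    # images: (n, 224, 224, 3) RGB ocular crops; returns one 4096-d vector each.
    return extractor.predict(preprocess_input(images.astype("float32")), verbose=0)

def difference_vectors(features, labels):
    # Turn all image pairs into difference vectors with genuine/impostor targets.
    diffs, targets = [], []
    for i in range(len(features)):
        for j in range(i + 1, len(features)):
            diffs.append(np.abs(features[i] - features[j]))   # assumed |a - b| pairing
            targets.append(1 if labels[i] == labels[j] else 0)  # 1 = genuine pair
    return np.array(diffs), np.array(targets)

# Hypothetical usage with UBIPr left-eye crops and subject labels:
# feats = deep_features(train_images)
# X, y = difference_vectors(feats, train_labels)
# svm = SVC(kernel="linear").fit(X, y)
# decision = svm.predict(np.abs(probe_feat - gallery_feat)[None, :])  # 1 = same person

A practical benefit of this pairwise formulation is that the SVM learns a single genuine-versus-impostor decision boundary rather than one classifier per subject, so it can verify identities that were not seen during training.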