Simple and efficient methods for gait recognition using pose information

Bibliographic details
Main author: Vítor Cézar de Lima
Publication date: 2021
Document type: Master's thesis
Language: eng
Source title: Repositório Institucional da UFMG
Full text: http://hdl.handle.net/1843/38879
https://orcid.org/0000-0002-7552-1957
Abstract: Gait is a biometric trait that differentiates individuals by the way they walk. Research on this topic has gained momentum because gait is unobtrusive and can be captured at a distance, which is desirable in surveillance scenarios. Most previous works rely on the human silhouette as the gait representation; however, silhouettes are affected by many factors, such as other movement in the scene, clothing, and carrying conditions. To avoid these problems, this work proposes a method called PoseDist, which uses pose estimation to retrieve the coordinates of body parts and transforms them into signals and movement histograms. These features are compared using a fusion of Subsequence Dynamic Time Warping and Euclidean distance to match gait sequences from the probe against those in the gallery. The method is evaluated on all views of CASIA Dataset A and compared with existing works, demonstrating its efficacy. However, since its computational cost is high, it is only suitable for environments with few individuals; therefore, a second method, called PoseFrame, is developed for gait recognition, training a multilayer perceptron to classify poses from individual frames and aggregating the per-frame results by majority voting. PoseFrame is tested on CASIA Dataset A, achieving higher accuracy than other model-based works, including PoseDist, and on CASIA Dataset B, achieving state-of-the-art accuracy in the same-view condition and some of the best results in the cross-view condition. Finally, an ablation study is performed to identify which body parts matter most for gait recognition; according to the findings, the arms and feet are the most important locations.
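The two matching schemes summarized in the abstract can be sketched in a few lines of Python. The first sketch illustrates the general idea behind PoseDist's comparison step, fusing Subsequence Dynamic Time Warping on the per-joint signals with a Euclidean distance on the movement histograms; the feature shapes and the fusion weight alpha are hypothetical placeholders, not the configuration used in the thesis.

import numpy as np

def subsequence_dtw(probe, gallery):
    # Cost of the best alignment of the probe sequence (n x d joint signals)
    # against any contiguous region of a gallery sequence (m x d).
    n, m = len(probe), len(gallery)
    frame_dist = np.linalg.norm(probe[:, None, :] - gallery[None, :, :], axis=-1)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, :] = 0.0  # the probe may start anywhere in the gallery sequence
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = frame_dist[i - 1, j - 1] + min(
                acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return acc[n, 1:].min()  # ...and end anywhere

def fused_distance(probe_signals, gallery_signals, probe_hist, gallery_hist, alpha=0.5):
    # Weighted fusion of the Subsequence-DTW distance on the signals and the
    # Euclidean distance on the movement histograms; alpha is an assumed weight.
    d_dtw = subsequence_dtw(np.asarray(probe_signals), np.asarray(gallery_signals))
    d_euc = np.linalg.norm(np.asarray(probe_hist) - np.asarray(gallery_hist))
    return alpha * d_dtw + (1.0 - alpha) * d_euc

A probe would then be assigned the identity of the gallery subject with the smallest fused distance. The second sketch illustrates the PoseFrame idea of classifying individual frames and aggregating by majority voting; it uses scikit-learn's MLPClassifier as a stand-in for the thesis's multilayer perceptron, and the hidden-layer size and iteration budget are illustrative defaults only.

from collections import Counter
import numpy as np
from sklearn.neural_network import MLPClassifier  # stand-in for the thesis's network

def train_pose_classifier(train_frames, train_labels, hidden=(128,)):
    # One flattened pose vector per frame, labelled with the subject identity.
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=500)
    clf.fit(np.asarray(train_frames), np.asarray(train_labels))
    return clf

def identify_sequence(clf, probe_frames):
    # Classify each frame of the probe sequence independently, then aggregate
    # the per-frame identity predictions by majority voting.
    per_frame = clf.predict(np.asarray(probe_frames))
    return Counter(per_frame).most_common(1)[0][0]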
id UFMG_3b381495a688db690b992e91253ba8d7
oai_identifier_str oai:repositorio.ufmg.br:1843/38879
network_acronym_str UFMG
network_name_str Repositório Institucional da UFMG
repository_id_str
spelling Advisor: William Robson Schwartz (http://lattes.cnpq.br/0704592200063682). Committee: Guillermo Cámara Chávez; Aparecido Nilceu Marana. Author: Vítor Cézar de Lima (http://lattes.cnpq.br/8129396896162206). Deposited: 2021-12-17; issued: 2021-07-26. Identifiers: http://hdl.handle.net/1843/38879; https://orcid.org/0000-0002-7552-1957. Funding: CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico; FAPEMIG - Fundação de Amparo à Pesquisa do Estado de Minas Gerais; CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior.
dc.title.pt_BR.fl_str_mv Simple and efficient methods for gait recognition using pose information
dc.title.alternative.pt_BR.fl_str_mv Métodos eficientes e simples para reconhecimento de gait usando informação de pose
title Simple and efficient methods for gait recognition using pose information
spellingShingle Simple and efficient methods for gait recognition using pose information
Vítor Cézar de Lima
Gait recognition
Biometry
Computer vision
Computação – Teses
Visão por computador - Teses
Biometria – Teses
Reconhecimento de padrões (Computadores) – Teses
title_short Simple and efficient methods for gait recognition using pose information
title_full Simple and efficient methods for gait recognition using pose information
title_fullStr Simple and efficient methods for gait recognition using pose information
title_full_unstemmed Simple and efficient methods for gait recognition using pose information
title_sort Simple and efficient methods for gait recognition using pose information
author Vítor Cézar de Lima
author_facet Vítor Cézar de Lima
author_role author
dc.contributor.advisor1.fl_str_mv William Robson Schwartz
dc.contributor.advisor1Lattes.fl_str_mv http://lattes.cnpq.br/0704592200063682
dc.contributor.referee1.fl_str_mv Guillermo Cámara Chávez
dc.contributor.referee2.fl_str_mv Aparecido Nilceu Marana
dc.contributor.authorLattes.fl_str_mv http://lattes.cnpq.br/8129396896162206
dc.contributor.author.fl_str_mv Vítor Cézar de Lima
contributor_str_mv William Robson Schwartz
Guillermo Cámara Chávez
Aparecido Nilceu Marana
dc.subject.por.fl_str_mv Gait recognition
Biometry
Computer vision
topic Gait recognition
Biometry
Computer vision
Computação – Teses
Visão por computador - Teses
Biometria – Teses
Reconhecimento de padrões (Computadores) – Teses
dc.subject.other.pt_BR.fl_str_mv Computação – Teses
Visão por computador - Teses
Biometria – Teses
Reconhecimento de padrões (Computadores) – Teses
description Gait is a biometric trait that differentiates individuals by the way they walk. Research on this topic has gained momentum because gait is unobtrusive and can be captured at a distance, which is desirable in surveillance scenarios. Most previous works rely on the human silhouette as the gait representation; however, silhouettes are affected by many factors, such as other movement in the scene, clothing, and carrying conditions. To avoid these problems, this work proposes a method called PoseDist, which uses pose estimation to retrieve the coordinates of body parts and transforms them into signals and movement histograms. These features are compared using a fusion of Subsequence Dynamic Time Warping and Euclidean distance to match gait sequences from the probe against those in the gallery. The method is evaluated on all views of CASIA Dataset A and compared with existing works, demonstrating its efficacy. However, since its computational cost is high, it is only suitable for environments with few individuals; therefore, a second method, called PoseFrame, is developed for gait recognition, training a multilayer perceptron to classify poses from individual frames and aggregating the per-frame results by majority voting. PoseFrame is tested on CASIA Dataset A, achieving higher accuracy than other model-based works, including PoseDist, and on CASIA Dataset B, achieving state-of-the-art accuracy in the same-view condition and some of the best results in the cross-view condition. Finally, an ablation study is performed to identify which body parts matter most for gait recognition; according to the findings, the arms and feet are the most important locations.
publishDate 2021
dc.date.accessioned.fl_str_mv 2021-12-17T19:53:40Z
dc.date.available.fl_str_mv 2021-12-17T19:53:40Z
dc.date.issued.fl_str_mv 2021-07-26
dc.type.status.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.type.driver.fl_str_mv info:eu-repo/semantics/masterThesis
format masterThesis
status_str publishedVersion
dc.identifier.uri.fl_str_mv http://hdl.handle.net/1843/38879
dc.identifier.orcid.pt_BR.fl_str_mv https://orcid.org/0000-0002-7552-1957
url http://hdl.handle.net/1843/38879
https://orcid.org/0000-0002-7552-1957
dc.language.iso.fl_str_mv eng
language eng
dc.rights.driver.fl_str_mv http://creativecommons.org/licenses/by-nc/3.0/pt/
info:eu-repo/semantics/openAccess
rights_invalid_str_mv http://creativecommons.org/licenses/by-nc/3.0/pt/
eu_rights_str_mv openAccess
dc.publisher.none.fl_str_mv Universidade Federal de Minas Gerais
dc.publisher.program.fl_str_mv Programa de Pós-Graduação em Ciência da Computação
dc.publisher.initials.fl_str_mv UFMG
dc.publisher.country.fl_str_mv Brasil
dc.publisher.department.fl_str_mv ICX - DEPARTAMENTO DE CIÊNCIA DA COMPUTAÇÃO
publisher.none.fl_str_mv Universidade Federal de Minas Gerais
dc.source.none.fl_str_mv reponame:Repositório Institucional da UFMG
instname:Universidade Federal de Minas Gerais (UFMG)
instacron:UFMG
instname_str Universidade Federal de Minas Gerais (UFMG)
instacron_str UFMG
institution UFMG
reponame_str Repositório Institucional da UFMG
collection Repositório Institucional da UFMG
bitstream.url.fl_str_mv https://repositorio.ufmg.br/bitstream/1843/38879/1/tese.pdf
https://repositorio.ufmg.br/bitstream/1843/38879/2/license_rdf
https://repositorio.ufmg.br/bitstream/1843/38879/3/license.txt
bitstream.checksum.fl_str_mv 3c6da3aaef82dbe45069381193d1a8da
33b8016dc5c4681c1e7a582a4161162c
cda590c95a0b51b4d15f60c9642ca272
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
repository.name.fl_str_mv Repositório Institucional da UFMG - Universidade Federal de Minas Gerais (UFMG)
repository.mail.fl_str_mv
_version_ 1803589572036132864