Recognition of Activities of Daily Living and Environments Using Acoustic Sensors Embedded on Mobile Devices

Bibliographic details
Main author: Pires, Ivan
Publication date: 2019
Other authors: Marques, Gonçalo, Garcia, Nuno M., Pombo, Nuno, Flórez-Revuelta, Francisco, Spinsante, Susanna, Teixeira, Maria Canavarro, Zdravevski, Eftim
Document type: Article
Language: English (eng)
Source title: Repositório Científico de Acesso Aberto de Portugal (Repositórios Científicos)
Full text: http://hdl.handle.net/10400.6/8255
Abstract: The identification of Activities of Daily Living (ADL) is intrinsically linked to the recognition of the user's environment, and both can be performed with the standard sensors embedded in everyday mobile devices. The main proposal is to recognize the user's environment and standing activities, and to include these capabilities in a framework for ADL and environment identification. The paper is therefore divided into two parts: first, acoustic sensors are used to collect data for environment recognition; second, the recognized environment is fused with the information gathered by motion and magnetic sensors. Environment and ADL recognition are performed with pattern recognition techniques within a system comprising data collection, processing, fusion and classification stages. The classification stage compares distinct types of Artificial Neural Networks (ANN), evaluating several implementations and selecting the most suitable one for each stage of the developed system. The results show 85.89% accuracy for ADL recognition using Deep Neural Networks (DNN) with normalized data, 86.50% accuracy for environment recognition using Feedforward Neural Networks (FNN) with non-normalized data, and 100% accuracy for standing-activity recognition using DNN with normalized data, which is the most suitable configuration for the intended purpose.
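
Illustrative note (not taken from the paper, whose implementation details are not part of this record): the Python sketch below, using scikit-learn and purely hypothetical placeholder features X and labels y, shows the kind of comparison the abstract reports, namely a multi-layer "DNN-style" classifier trained on normalized features versus a single-hidden-layer "FNN-style" classifier trained on raw features. All variable names, layer sizes, and the synthetic data are assumptions for demonstration only.

# Minimal sketch, not the authors' implementation: it mimics the comparison
# reported in the abstract, a deeper network on normalized features ("DNN")
# versus a single-hidden-layer network on raw features ("FNN").
# X and y are hypothetical placeholders, so the printed accuracies are near
# chance; only the structure of the comparison is meaningful.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))      # placeholder acoustic/motion/magnetic features
y = rng.integers(0, 5, size=1000)    # placeholder ADL/environment labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# "DNN" variant: several hidden layers, features standardized first.
dnn = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(128, 64, 32), max_iter=500, random_state=0))

# "FNN" variant: one hidden layer, non-normalized features.
fnn = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)

for name, model in [("DNN + normalization", dnn), ("FNN, raw features", fnn)]:
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.4f}")
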
Contributor: uBibliorum
Keywords: Activities of Daily Living (ADL); Data fusion; Environments; Feature extraction; Pattern recognition; Sensors
Date deposited: 2020-01-14
Version: Published version
DOI: 10.3390/electronics8121499
Access rights: Open access
Format: application/pdf
Institution: Agência para a Sociedade do Conhecimento (UMIC) - FCT - Sociedade da Informação