Virtual Interface With Kinect 3D Sensor for Interaction With Bedridden People: First Insights

Bibliographic details
Main author: Carvalho, Vitor
Publication date: 2021
Other authors: Eusébio, José
Document type: Article
Language: English
Source title: Repositório Científico de Acesso Aberto de Portugal (Repositórios Científicos)
Full text: http://hdl.handle.net/11110/2424
Abstract: Human-machine interaction has evolved significantly in recent years, opening a new range of opportunities for developing solutions for people with physical limitations. Natural user interfaces (NUI) allow bedridden and/or physically disabled people to perform a set of actions through gestures, thus increasing their quality of life and autonomy. This paper presents a solution based on image processing and computer vision, using the Kinect 3D sensor, for the development of applications that recognize gestures made by the human hand. The gestures are identified by a software application that triggers a set of actions of utmost importance for the bedridden person, for example, triggering an emergency call, switching the TV on/off, or controlling the bed slope. A shape matching technique was used to recognize six gestures, with the final actions actuated through the Arduino platform. The results show a success rate of 96%. This system can improve the quality of life and autonomy of bedridden people and can be adapted to the specific needs of an individual subject.
Keywords: Bedridden People; Kinect; Natural User Interface; Shape Matching
Access: Open access
Publisher: International Journal of Healthcare Information Systems and Informatics