Handling Occlusions in Augmented Reality for Mobile Devices
Main author: | Sanches, Silvio R. R. |
---|---|
Publication date: | 2019 |
Other authors: | Silva, Vitor V. G.; Sementille, Antonio C. [UNESP]; Correa, Cleber G.; Oliveira, Claiton |
Document type: | Conference paper |
Language: | Portuguese (por) |
Source title: | Repositório Institucional da UNESP |
Full text: | http://dx.doi.org/10.1109/SVR.2019.00033 http://hdl.handle.net/11449/195395 |
Abstract: | The coherent display of the spatial relationship between real and virtual elements is a desirable feature for realistic Augmented Reality (AR) systems. The system must be able to automatically estimate the depths of the real and virtual elements that compose the scene and display in the foreground of the frame the element closest to the camera. In other words, real and virtual elements must hide (totally or partially) each other according to their depths in the frame. This paper presents a method for handling occlusion between scene elements in mobile applications. The proposed solution estimates the depths of the real and virtual elements from the proportions of the frame's pixels that each one occupies. A segmentation method based on a skin-color model is then applied to extract the real element in frames in which it is in the foreground of the scene and must therefore overlap the virtual object. The evaluation results showed that the proposed method outperforms those found in the literature when a set of important features is compared. |
Keywords: | Augmented Reality; Mobile Devices; Handling Occlusion |
Affiliations: | Universidade Tecnológica Federal do Paraná (UTFPR), Cornélio Procópio, Brazil; Universidade Estadual Paulista (UNESP), Bauru, SP, Brazil |
Published in: | 2019 21st Symposium on Virtual and Augmented Reality (SVR 2019). Los Alamitos: IEEE Computer Society, p. 112-119, 2019. |
Publisher: | IEEE Computer Society |
ISSN: | 2377-746X |
DOI: | 10.1109/SVR.2019.00033 |
Web of Science ID: | WOS:000534536600017 |
Access rights: | Open access |
Record ID: | UNSP_93549bfa42ca844f4526cc95e7624232 |
OAI identifier: | oai:repositorio.unesp.br:11449/195395 |
Repository: | Repositório Institucional da UNESP - Universidade Estadual Paulista (UNESP) |
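For readers interested in how the two cues described in the abstract could fit together, the sketch below is a minimal illustration, not the authors' implementation. It assumes OpenCV and NumPy, a pre-rendered RGBA virtual layer already aligned with the camera frame, and generic YCrCb skin-color thresholds in place of the calibrated skin-color model used in the paper. The pixel-proportion comparison is a coarse depth cue: whichever element covers a larger share of the frame is treated as nearer to the camera.

```python
# Illustrative sketch (hypothetical, not the authors' code): occlusion handling
# by comparing the pixel proportions of the real and virtual elements and
# masking the virtual layer with a skin-color segmentation of the real element.
import cv2
import numpy as np

# Generic YCrCb skin-color bounds (assumed values; the paper builds its own model).
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)


def skin_mask(frame_bgr):
    """Binary mask of skin-colored pixels in the camera frame."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)
    # Light morphological opening to remove speckle noise.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)


def composite_with_occlusion(frame_bgr, virtual_rgba):
    """Overlay the virtual layer, letting the real element occlude it when its
    pixel proportion suggests it is closer to the camera."""
    h, w = frame_bgr.shape[:2]
    total = float(h * w)

    real_mask = skin_mask(frame_bgr)
    virt_alpha = virtual_rgba[:, :, 3]

    # Pixel proportions relative to the whole frame, used as a coarse depth cue.
    real_prop = np.count_nonzero(real_mask) / total
    virt_prop = np.count_nonzero(virt_alpha) / total

    # Plain alpha blend of the virtual object over the camera frame.
    alpha = (virt_alpha.astype(np.float32) / 255.0)[:, :, None]
    out = (frame_bgr * (1.0 - alpha) + virtual_rgba[:, :, :3] * alpha).astype(np.uint8)

    if real_prop > virt_prop:
        # Real element judged closer: restore the camera pixels wherever the
        # segmented real element overlaps the virtual object.
        occluding = (real_mask > 0) & (virt_alpha > 0)
        out[occluding] = frame_bgr[occluding]
    return out
```

In a real mobile pipeline this comparison would run per frame, and the fixed thresholds would be replaced by the paper's own skin-color model and segmentation step.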