Semantic Guided Interactive Image Retrieval for plant identification
Main author: | Gonçalves, Filipe Marcel Fernandes [UNESP] |
---|---|
Publication date: | 2018 |
Other authors: | Guilherme, Ivan Rizzo [UNESP], Pedronette, Daniel Carlos Guimarães [UNESP] |
Document type: | Article |
Language: | eng |
Source title: | Repositório Institucional da UNESP |
Full text: | http://dx.doi.org/10.1016/j.eswa.2017.08.035 http://hdl.handle.net/11449/175105 |
Abstract: | Large numbers of images are currently generated in many domains, and their identification and analysis require specialized knowledge. On the one hand, considerable advances have been made in image retrieval techniques based on visual image properties; however, the semantic gap between low-level features and high-level concepts remains a challenging problem. On the other hand, knowledge in many fields has also been structured through ontologies. A promising way to bridge the semantic gap is to combine information from low-level features with semantic knowledge. This work proposes a novel graph-based approach, named Semantic Interactive Image Retrieval (SIIR), that combines Content-Based Image Retrieval (CBIR), unsupervised learning, ontology techniques, and interactive retrieval. To the best of our knowledge, no approach in the literature combines these techniques as SIIR does. The approach supports expert identification tasks, such as a biologist identifying plants from Angiosperm families. Because the system exploits information from different sources (visual content, an ontology, and user interactions), the effort required from the user is drastically reduced. For the semantic model, we developed a domain ontology that represents plant properties and structures and relates features of Angiosperm families. A novel graph-based approach combines this semantic information with the visual retrieval results: a bipartite graph and a discriminative attribute graph support the semantic selection of the most discriminative attributes for plant identification. The selected attributes are used to formulate a question to the user, and the system updates the similarity information among images based on the user's answer, improving retrieval effectiveness and further reducing the user's effort. The proposed method was evaluated on the popular Oxford Flowers 17 and 102 Classes datasets, yielding highly effective results on both when compared with other approaches; for example, the precision of the first five retrieved images reaches 97.07% on the 17-class dataset and 91.33% on the 102-class dataset. |
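The abstract outlines an interactive loop: select the most discriminative attribute from the semantic model, ask the user a question about it, and update the similarities among images based on the answer. The following is a minimal, hypothetical Python sketch of that loop; the "most balanced split" heuristic, the function names, and the additive re-weighting are assumptions for illustration, not the paper's actual bipartite / discriminative attribute graph formulation.

```python
# Illustrative sketch only: the split-balance heuristic and the additive
# similarity update are assumptions, not the paper's graph-based method.

def most_discriminative_attribute(candidates, image_attributes):
    """Pick the attribute whose presence/absence divides the current
    candidate images closest to 50/50 (a simple stand-in for the paper's
    semantic selection of discriminative attributes)."""
    attrs = {a for img in candidates for a in image_attributes[img]}
    best_attr, best_balance = None, -1.0
    for attr in attrs:
        have = sum(1 for img in candidates if attr in image_attributes[img])
        balance = min(have, len(candidates) - have) / len(candidates)
        if balance > best_balance:
            best_attr, best_balance = attr, balance
    return best_attr

def update_similarities(similarities, candidates, image_attributes,
                        attr, user_says_yes, boost=0.2):
    """Raise the score of images consistent with the user's yes/no answer
    about `attr` and lower the rest (illustrative re-weighting only)."""
    for img in candidates:
        consistent = (attr in image_attributes[img]) == user_says_yes
        similarities[img] += boost if consistent else -boost
    return similarities
```

In SIIR itself, the initial similarities come from the CBIR and unsupervised learning stages, the candidate attributes come from the domain ontology of Angiosperm families, and the question shown to the user is phrased from the selected attribute.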
id |
UNSP_88c0900ab7bfc3dbc73db001850f3a46 |
oai_identifier_str |
oai:repositorio.unesp.br:11449/175105 |
network_acronym_str |
UNSP |
network_name_str |
Repositório Institucional da UNESP |
repository_id_str |
2946 |
spelling
Semantic Guided Interactive Image Retrieval for plant identification
Interactive image retrieval; Ontology; Semantic gap; Unsupervised learning
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Department of Statistics Applied Mathematics and Computing (DEMAC) State University of São Paulo (UNESP), Av. 24-A, 1515
FAPESP: 2013/08645-0
Universidade Estadual Paulista (Unesp)
Gonçalves, Filipe Marcel Fernandes [UNESP]
Guilherme, Ivan Rizzo [UNESP]
Pedronette, Daniel Carlos Guimarães [UNESP]
2018-12-11T17:14:24Z
2018-01-01
info:eu-repo/semantics/publishedVersion
info:eu-repo/semantics/article
12-26
application/pdf
http://dx.doi.org/10.1016/j.eswa.2017.08.035
Expert Systems with Applications, v. 91, p. 12-26.
0957-4174
http://hdl.handle.net/11449/175105
10.1016/j.eswa.2017.08.035
2-s2.0-85028510372
2-s2.0-85028510372.pdf
Scopus
reponame:Repositório Institucional da UNESP
instname:Universidade Estadual Paulista (UNESP)
instacron:UNESP
eng
Expert Systems with Applications
1,271
info:eu-repo/semantics/openAccess
2023-12-20T06:20:32Z
oai:repositorio.unesp.br:11449/175105
Repositório Institucional
PUB
http://repositorio.unesp.br/oai/request
opendoar:2946
2024-08-05T20:49:03.249182
Repositório Institucional da UNESP - Universidade Estadual Paulista (UNESP)
false |
dc.title.none.fl_str_mv |
Semantic Guided Interactive Image Retrieval for plant identification |
title |
Semantic Guided Interactive Image Retrieval for plant identification |
spellingShingle |
Semantic Guided Interactive Image Retrieval for plant identification
Gonçalves, Filipe Marcel Fernandes [UNESP]
Interactive image retrieval
Ontology
Semantic gap
Unsupervised learning |
title_short |
Semantic Guided Interactive Image Retrieval for plant identification |
title_full |
Semantic Guided Interactive Image Retrieval for plant identification |
title_fullStr |
Semantic Guided Interactive Image Retrieval for plant identification |
title_full_unstemmed |
Semantic Guided Interactive Image Retrieval for plant identification |
title_sort |
Semantic Guided Interactive Image Retrieval for plant identification |
author |
Gonçalves, Filipe Marcel Fernandes [UNESP] |
author_facet |
Gonçalves, Filipe Marcel Fernandes [UNESP]
Guilherme, Ivan Rizzo [UNESP]
Pedronette, Daniel Carlos Guimarães [UNESP] |
author_role |
author |
author2 |
Guilherme, Ivan Rizzo [UNESP]
Pedronette, Daniel Carlos Guimarães [UNESP] |
author2_role |
author author |
dc.contributor.none.fl_str_mv |
Universidade Estadual Paulista (Unesp) |
dc.contributor.author.fl_str_mv |
Gonçalves, Filipe Marcel Fernandes [UNESP]
Guilherme, Ivan Rizzo [UNESP]
Pedronette, Daniel Carlos Guimarães [UNESP] |
dc.subject.por.fl_str_mv |
Interactive image retrieval
Ontology
Semantic gap
Unsupervised learning |
topic |
Interactive image retrieval
Ontology
Semantic gap
Unsupervised learning |
description |
Large numbers of images are currently generated in many domains, and their identification and analysis require specialized knowledge. On the one hand, considerable advances have been made in image retrieval techniques based on visual image properties; however, the semantic gap between low-level features and high-level concepts remains a challenging problem. On the other hand, knowledge in many fields has also been structured through ontologies. A promising way to bridge the semantic gap is to combine information from low-level features with semantic knowledge. This work proposes a novel graph-based approach, named Semantic Interactive Image Retrieval (SIIR), that combines Content-Based Image Retrieval (CBIR), unsupervised learning, ontology techniques, and interactive retrieval. To the best of our knowledge, no approach in the literature combines these techniques as SIIR does. The approach supports expert identification tasks, such as a biologist identifying plants from Angiosperm families. Because the system exploits information from different sources (visual content, an ontology, and user interactions), the effort required from the user is drastically reduced. For the semantic model, we developed a domain ontology that represents plant properties and structures and relates features of Angiosperm families. A novel graph-based approach combines this semantic information with the visual retrieval results: a bipartite graph and a discriminative attribute graph support the semantic selection of the most discriminative attributes for plant identification. The selected attributes are used to formulate a question to the user, and the system updates the similarity information among images based on the user's answer, improving retrieval effectiveness and further reducing the user's effort. The proposed method was evaluated on the popular Oxford Flowers 17 and 102 Classes datasets, yielding highly effective results on both when compared with other approaches; for example, the precision of the first five retrieved images reaches 97.07% on the 17-class dataset and 91.33% on the 102-class dataset. |
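The reported figures are the precision of the first five retrieved images (precision at k = 5). A minimal sketch of that measure follows, under the assumption (for illustration only) that a retrieved image counts as relevant when it shares the query image's class label.

```python
def precision_at_k(retrieved_labels, query_label, k=5):
    """Fraction of the top-k retrieved images whose class label matches
    the query's label (relevance criterion assumed for illustration)."""
    top_k = retrieved_labels[:k]
    return sum(1 for label in top_k if label == query_label) / k

# e.g. four of the first five results share the query's class -> 0.8
print(precision_at_k(["daisy", "daisy", "tulip", "daisy", "daisy"], "daisy"))
```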
publishDate |
2018 |
dc.date.none.fl_str_mv |
2018-12-11T17:14:24Z 2018-12-11T17:14:24Z 2018-01-01 |
dc.type.status.fl_str_mv |
info:eu-repo/semantics/publishedVersion |
dc.type.driver.fl_str_mv |
info:eu-repo/semantics/article |
format |
article |
status_str |
publishedVersion |
dc.identifier.uri.fl_str_mv |
http://dx.doi.org/10.1016/j.eswa.2017.08.035
Expert Systems with Applications, v. 91, p. 12-26.
0957-4174
http://hdl.handle.net/11449/175105
10.1016/j.eswa.2017.08.035
2-s2.0-85028510372
2-s2.0-85028510372.pdf |
url |
http://dx.doi.org/10.1016/j.eswa.2017.08.035 http://hdl.handle.net/11449/175105 |
identifier_str_mv |
Expert Systems with Applications, v. 91, p. 12-26.
0957-4174
10.1016/j.eswa.2017.08.035
2-s2.0-85028510372
2-s2.0-85028510372.pdf |
dc.language.iso.fl_str_mv |
eng |
language |
eng |
dc.relation.none.fl_str_mv |
Expert Systems with Applications 1,271 |
dc.rights.driver.fl_str_mv |
info:eu-repo/semantics/openAccess |
eu_rights_str_mv |
openAccess |
dc.format.none.fl_str_mv |
12-26 application/pdf |
dc.source.none.fl_str_mv |
Scopus reponame:Repositório Institucional da UNESP instname:Universidade Estadual Paulista (UNESP) instacron:UNESP |
instname_str |
Universidade Estadual Paulista (UNESP) |
instacron_str |
UNESP |
institution |
UNESP |
reponame_str |
Repositório Institucional da UNESP |
collection |
Repositório Institucional da UNESP |
repository.name.fl_str_mv |
Repositório Institucional da UNESP - Universidade Estadual Paulista (UNESP) |
repository.mail.fl_str_mv |
|
_version_ |
1808129252055318528 |