A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery
Principal author: | Osco, Lucas Prado |
---|---|
Publication date: | 2020 |
Other authors: | Arruda, Mauro dos Santos de; Marcato Junior, Jose; Silva, Neemias Buceli da; Marques Ramos, Ana Paula; Saito Moryia, Erika Akemi; Imai, Nilton Nobuhiro; Pereira, Danillo Roberto; Creste, Jose Eduardo; Matsubara, Edson Takashi; Li, Jonathan; Goncalves, Wesley Nunes |
Document type: | Article |
Language: | eng |
Source: | Repositório Institucional da UNESP |
Full text: | http://dx.doi.org/10.1016/j.isprsjprs.2019.12.010 http://hdl.handle.net/11449/197328 |
Abstract: | Visual inspection has been a common practice for determining the number of plants in orchards, but it is a labor-intensive and time-consuming task. Deep learning algorithms have demonstrated great potential for counting plants in unmanned aerial vehicle (UAV)-borne sensor imagery. This paper presents a convolutional neural network (CNN) approach to the challenge of estimating the number of citrus trees in highly dense orchards from UAV multispectral images. The method estimates a dense map with the confidence that a plant occurs at each pixel. A flight was conducted over an orchard of Valencia orange trees planted in a linear fashion, using a multispectral camera with four bands: green, red, red-edge and near-infrared. The approach was assessed considering the individual bands and their combinations. A total of 37,353 trees, annotated as point features, were used to evaluate the method. A variation of sigma (0.5, 1.0 and 1.5) was used to generate different ground-truth confidence maps, and different numbers of stages (T) were used to refine the predicted confidence map. To evaluate the robustness of our method, we compared it with two state-of-the-art object detection CNN methods (Faster R-CNN and RetinaNet). The results show the best performance with the combination of the green, red and near-infrared bands, achieving a Mean Absolute Error (MAE), Mean Square Error (MSE), R² and Normalized Root-Mean-Squared Error (NRMSE) of 2.28, 9.82, 0.96 and 0.05, respectively. This band combination, when adopting sigma = 1 and stage T = 8, resulted in an R², MAE, Precision, Recall and F1 of 0.97, 2.05, 0.95, 0.96 and 0.95, respectively. Our method significantly outperforms the object detection methods for counting and geolocation. 
It was concluded that our CNN approach for estimating the number and geolocation of citrus trees in high-density orchards is satisfactory and is an effective strategy to replace the traditional visual inspection method for determining the number of trees in orchards. |
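The abstract centers on two technical steps: rendering each annotated tree point as a 2D Gaussian of spread sigma to build a ground-truth confidence map, and reading tree counts and locations back off a predicted confidence map. A minimal NumPy sketch of those two ideas; the function names, the max-combination of overlapping Gaussians, and the 3×3 peak-picking rule are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def gaussian_confidence_map(points, height, width, sigma=1.0):
    # Render point annotations (row, col) as a confidence map: each tree
    # contributes a 2D Gaussian of spread `sigma` pixels, peaking at 1.0.
    yy, xx = np.mgrid[0:height, 0:width]
    conf = np.zeros((height, width), dtype=np.float32)
    for r, c in points:
        g = np.exp(-((yy - r) ** 2 + (xx - c) ** 2) / (2.0 * sigma ** 2))
        conf = np.maximum(conf, g)  # keep the strongest response where trees overlap
    return conf

def count_peaks(conf, threshold=0.5):
    # Count strict local maxima above `threshold`: one peak per detected tree.
    h, w = conf.shape
    locations = []
    for r in range(h):
        for c in range(w):
            v = conf[r, c]
            if v < threshold:
                continue
            nb = conf[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (nb < v).sum() == nb.size - 1:  # every neighbour is strictly smaller
                locations.append((r, c))
    return len(locations), locations

truth = gaussian_confidence_map([(10, 10), (10, 20)], 32, 32, sigma=1.0)
count, locations = count_peaks(truth)  # two well-separated trees -> count == 2
```

A larger `sigma` widens each Gaussian, which can merge the responses of closely spaced trees; this is one reason the choice of sigma (0.5, 1.0, 1.5) matters in the dense-orchard setting the abstract describes.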
id |
UNSP_5bf2b65e7850c9a313a647dcf71dcb07 |
---|---|
oai_identifier_str |
oai:repositorio.unesp.br:11449/197328 |
network_acronym_str |
UNSP |
network_name_str |
Repositório Institucional da UNESP |
repository_id_str |
2946 |
spelling |
Title: A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery 
Keywords: Deep learning; Multispectral image; UAV-borne sensor; Object detection; Citrus tree counting; Orchard 
Abstract: as in the record above 
Funding: Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) 433783/2018-4 and 304173/2016-9; Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) 88881.311850/2018-01; Fundect 59/300.066/2015 
Affiliations: Univ Fed Mato Grosso do Sul, Fac Engn Architecture & Urbanism & Geog, Campo Grande, MS, Brazil; Univ Fed Mato Grosso do Sul, Fac Comp Sci, Campo Grande, MS, Brazil; Univ Western Sao Paulo, Fac Agron, Sao Paulo, Brazil; Univ Western Sao Paulo, Fac Engn & Architecture, Sao Paulo, Brazil; Sao Paulo State Univ, Dept Cartog Sci, BR-19060900 Presidente Prudente, SP, Brazil; Univ Western Sao Paulo, Fac Comp Sci, Sao Paulo, Brazil; Univ Waterloo, Dept Geog & Environm Management, Waterloo, ON N2L 3G1, Canada; Univ Waterloo, Dept Syst Design Engn, Waterloo, ON N2L 3G1, Canada 
Publisher: Elsevier B.V. 
Contributing institutions: Universidade Federal de Mato Grosso do Sul (UFMS); Univ Western Sao Paulo; Sao Paulo State Univ; Univ Waterloo; Universidade Estadual Paulista (Unesp) 
Authors: Osco, Lucas Prado; Arruda, Mauro dos Santos de; Marcato Junior, Jose; Silva, Neemias Buceli da; Marques Ramos, Ana Paula; Saito Moryia, Erika Akemi; Imai, Nilton Nobuhiro; Pereira, Danillo Roberto; Creste, Jose Eduardo; Matsubara, Edson Takashi; Li, Jonathan; Goncalves, Wesley Nunes 
Dates: deposited 2020-12-10T20:13:32Z; published 2020-02-01 
Citation: ISPRS Journal of Photogrammetry and Remote Sensing. Amsterdam: Elsevier, v. 160, p. 97-106, 2020. ISSN 0924-2716. DOI 10.1016/j.isprsjprs.2019.12.010. Handle http://hdl.handle.net/11449/197328. WOS:000510525500007. ORCID 0000-0003-0516-0567. 
Rights: info:eu-repo/semantics/openAccess 
Last harvested: 2024-06-18T15:02:06Z (oai:repositorio.unesp.br:11449/197328) |
dc.title.none.fl_str_mv |
A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery |
author |
Osco, Lucas Prado |
author2 |
Arruda, Mauro dos Santos de; Marcato Junior, Jose; Silva, Neemias Buceli da; Marques Ramos, Ana Paula; Saito Moryia, Erika Akemi; Imai, Nilton Nobuhiro; Pereira, Danillo Roberto; Creste, Jose Eduardo; Matsubara, Edson Takashi; Li, Jonathan; Goncalves, Wesley Nunes |
dc.contributor.none.fl_str_mv |
Universidade Federal de Mato Grosso do Sul (UFMS); Univ Western Sao Paulo; Sao Paulo State Univ; Univ Waterloo; Universidade Estadual Paulista (Unesp) |
dc.subject.por.fl_str_mv |
Deep learning; Multispectral image; UAV-borne sensor; Object detection; Citrus tree counting; Orchard |
publishDate |
2020 |
dc.date.none.fl_str_mv |
2020-12-10T20:13:32Z 2020-12-10T20:13:32Z 2020-02-01 |
dc.type.status.fl_str_mv |
info:eu-repo/semantics/publishedVersion |
dc.type.driver.fl_str_mv |
info:eu-repo/semantics/article |
format |
article |
status_str |
publishedVersion |
dc.identifier.uri.fl_str_mv |
http://dx.doi.org/10.1016/j.isprsjprs.2019.12.010 
ISPRS Journal of Photogrammetry and Remote Sensing. Amsterdam: Elsevier, v. 160, p. 97-106, 2020. 
ISSN 0924-2716 
http://hdl.handle.net/11449/197328 
DOI 10.1016/j.isprsjprs.2019.12.010 
WOS:000510525500007 
2985771102505330 
ORCID 0000-0003-0516-0567 |
url |
http://dx.doi.org/10.1016/j.isprsjprs.2019.12.010 http://hdl.handle.net/11449/197328 |
identifier_str_mv |
ISPRS Journal of Photogrammetry and Remote Sensing. Amsterdam: Elsevier, v. 160, p. 97-106, 2020. 
ISSN 0924-2716 
DOI 10.1016/j.isprsjprs.2019.12.010 
WOS:000510525500007 
2985771102505330 
ORCID 0000-0003-0516-0567 |
dc.language.iso.fl_str_mv |
eng |
language |
eng |
dc.relation.none.fl_str_mv |
ISPRS Journal of Photogrammetry and Remote Sensing |
dc.rights.driver.fl_str_mv |
info:eu-repo/semantics/openAccess |
eu_rights_str_mv |
openAccess |
dc.format.none.fl_str_mv |
97-106 |
dc.publisher.none.fl_str_mv |
Elsevier B.V. |
publisher.none.fl_str_mv |
Elsevier B.V. |
dc.source.none.fl_str_mv |
Web of Science; reponame: Repositório Institucional da UNESP; instname: Universidade Estadual Paulista (UNESP); instacron: UNESP |
instname_str |
Universidade Estadual Paulista (UNESP) |
instacron_str |
UNESP |
institution |
UNESP |
reponame_str |
Repositório Institucional da UNESP |
collection |
Repositório Institucional da UNESP |
repository.name.fl_str_mv |
Repositório Institucional da UNESP - Universidade Estadual Paulista (UNESP) |
repository.mail.fl_str_mv |
|
_version_ |
1808129521828757504 |