2D LiDAR-based perception for under canopy autonomous scouting of small ground robots within narrow lanes of agricultural fields

Bibliographic details
Main author: Higuti, Vitor Akihiro Hisano
Publication date: 2021
Document type: Doctoral thesis
Language: eng
Source title: Biblioteca Digital de Teses e Dissertações da USP
Full text: https://www.teses.usp.br/teses/disponiveis/18/18149/tde-31052021-194235/
Abstract: Lightweight ground robots have increasingly shown their potential to tackle agricultural tasks that demand time-consuming human labor and are therefore limited in detail or area coverage. One task that benefits several modern agricultural practices is scouting (walking through the field). However, reliable autonomous navigation is still a challenge: a robot must deal with the dynamism of the agricultural field and with unpredictable obstacles, e.g. humans and machines. Corn and sorghum in particular present an additional issue due to their standard way of cultivation: narrow sub-meter lanes that become even less visible at later growth stages due to dense coverage and the spreading of leaves. This condition heavily affects the sensors, provoking frequent occlusions, misreadings, and other situations outside their working range. In this context, three questions arise: 1) Can the unexplored potential of the Light Detection and Ranging (LiDAR) sensor suffice to interpret a narrow-lane crop environment without artificial landmarks? 2) Does the search for a best line representation of the crop rows truly capture the problem? 3) How can lateral distance estimation be improved through sensor fusion? To answer these three questions, Perception LiDAR (PL) was developed to estimate the lateral distance to crop rows using a 2D LiDAR as the core sensor; an Extended Kalman Filter enables the fusion of embedded odometry and inertial measurements. Manual ground truth has shown that finding the best line representation from LiDAR data is challenging: as few as 54% of line estimates fell within 0.05 m of error. Nonetheless, the proposed method has enabled 72 km of autonomous operation across multiple robots. Because the RTK-GNSS signal is unreliable under the canopy, PL's outputs significantly outperform GNSS-based positioning, which may not even place the robot in the correct lane. Extensive field testing took place in multiple corn and sorghum fields between 2017 and 2020. For continuous lanes, i.e. where at least one of the side rows exists, the success rate for finishing the desired segment of autonomous navigation is 89.76%. The higher performance when the LiDAR input contains significantly fewer objects other than stalks, together with the concentration of failures at sensor occlusions and gaps (both situations with few to no visible rows), strongly indicates that the problem is not best line fitting but rather classification, where the end goal is to find stalks. In summary, although PL does not yet provide fully intervention-free navigation within crop rows, its current capabilities relieve operators from the tedious task of manually driving the robot the whole time and pave the way toward a fully autonomous agricultural robot.
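The abstract's core geometric step (fitting a line to 2D LiDAR returns from one crop row and taking the robot's perpendicular offset to it as the lateral distance) can be sketched as below. This is an illustrative reconstruction, not the thesis' Perception LiDAR code: the function name, the fixed range gate, and the PCA-based total-least-squares fit are all assumptions made for the example.

# Minimal sketch (not the actual PL implementation): estimate the lateral
# distance from the robot to one crop row from a single 2D LiDAR scan.
import numpy as np

def lateral_distance_to_row(angles_rad, ranges_m, side="left", max_range_m=1.5):
    """Fit a total-least-squares line to LiDAR points on one side of the
    robot and return the perpendicular distance from the robot to it."""
    # Convert polar readings to Cartesian points in the robot frame
    # (x forward, y left), dropping far returns such as the opposite row.
    valid = ranges_m < max_range_m
    x = ranges_m[valid] * np.cos(angles_rad[valid])
    y = ranges_m[valid] * np.sin(angles_rad[valid])
    pts = np.column_stack([x, y])
    pts = pts[pts[:, 1] > 0] if side == "left" else pts[pts[:, 1] < 0]
    if len(pts) < 2:
        return None  # too few stalk hits to define a row line

    # Total least squares via SVD: the first right singular vector is the
    # row direction, the second is the line's unit normal.
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[1]
    # Perpendicular distance from the robot origin to the fitted line.
    return abs(float(np.dot(normal, centroid)))

Per the abstract, this per-scan estimate is then fused with embedded odometry and inertial measurements through an Extended Kalman Filter; that fusion step is outside the scope of this sketch.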
id USP_9e6c1f3eb58dbe23b998b90e43723d15
oai_identifier_str oai:teses.usp.br:tde-31052021-194235
network_acronym_str USP
network_name_str Biblioteca Digital de Teses e Dissertações da USP
repository_id_str 2721
dc.title.none.fl_str_mv 2D LiDAR-based perception for under canopy autonomous scouting of small ground robots within narrow lanes of agricultural fields
Percepção baseada em LiDAR 2D para movimentação autônoma embaixo da folhagem de pequenos robôs terrestres em faixas estreitas de plantações agrícolas
dc.contributor.none.fl_str_mv Aroca, Rafael Vidal
Becker, Marcelo
dc.contributor.author.fl_str_mv Higuti, Vitor Akihiro Hisano
dc.subject.por.fl_str_mv Agricultural robots
LiDAR
Mobile robotics
Perception
dc.date.none.fl_str_mv 2021-03-29
dc.type.status.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.type.driver.fl_str_mv info:eu-repo/semantics/doctoralThesis
dc.identifier.uri.fl_str_mv https://www.teses.usp.br/teses/disponiveis/18/18149/tde-31052021-194235/
dc.language.iso.fl_str_mv eng
dc.rights.driver.fl_str_mv Release the content for public access.
info:eu-repo/semantics/openAccess
dc.format.none.fl_str_mv application/pdf
dc.publisher.none.fl_str_mv Biblioteca Digital de Teses e Dissertações da USP
dc.source.none.fl_str_mv
reponame:Biblioteca Digital de Teses e Dissertações da USP
instname:Universidade de São Paulo (USP)
instacron:USP
repository.name.fl_str_mv Biblioteca Digital de Teses e Dissertações da USP - Universidade de São Paulo (USP)
repository.mail.fl_str_mv virginia@if.usp.br||atendimento@aguia.usp.br
_version_ 1809091220616511488