2-DOF Laser Structure for Autonomous External Quadrotor Guidance
Main author: | Soler, Diego Pavan |
---|---|
Publication date: | 2019 |
Document type: | Master's thesis (Dissertação) |
Language: | eng |
Source: | Biblioteca Digital de Teses e Dissertações da USP |
Full text: | https://www.teses.usp.br/teses/disponiveis/18/18149/tde-12052020-175028/ |
Abstract: | This dissertation, 2-DOF Laser Structure for Autonomous External Quadrotor Guidance, proposes a novel method for attracting an aerial robot to a ground vehicle for autonomous landing, using a system composed of a camera and a laser. The application is designed to be used alongside other guidance methods. This is a complex problem that requires work in different fields on both the ground and the aerial robot. On the ground station, known as Mirã, a structure was designed using 3D-printed parts and two stepper motors set to 1:16 microstepping to precisely orient a camera and a laser with 2 DoF. The motors are controlled by a computer vision algorithm that tracks four green LEDs mounted on the aerial robot. By combining a light-source filter and a color filter with metric filters, the algorithm locates the LEDs in the image. From the offset between the center of the LED formation and the center of the image, the algorithm uses the camera calibration parameters to compute azimuth and elevation angles, allowing the structure to follow and aim the laser at the aerial robot with errors of 4.96° in elevation and 0.27° in azimuth, with standard deviations of 1.03° and 0.51°, respectively. To call the aerial robot, the structure performs a linear cut by moving the elevation motor; for this reason, the azimuth precision is made much higher. On the aerial vehicle, a 0.5 x 0.5 m sensor built from a 6 x 6 matrix of photodetectors was designed to estimate the position of the laser dot on the sensor at a rate of 200 Hz. The aerial robot decodes the sensor readings and accumulates multiple samples; once enough points are available, a linear approximation of the point collection is used to estimate a vector, whose direction is determined by comparing the older and newer points, resulting in a velocity vector that the aerial robot follows. Once the vehicle is close enough, the structure performs a cut in the opposite direction, commanding the aerial robot to stop and allowing for reliable autonomous landing. |
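The abstract describes the ground-station pipeline in three steps: find the four green LEDs with a light-source (brightness), color, and metric (blob-size) filter, take the center of the LED formation, and convert its offset from the image center into azimuth and elevation angles using the camera calibration parameters. The snippet below is only a minimal Python/OpenCV sketch of that kind of pipeline, not the dissertation's implementation; the HSV thresholds, area limits, and camera intrinsics (fx, fy, cx, cy) are illustrative assumptions.

```python
import cv2
import numpy as np

def find_led_center(bgr, min_area=5.0, max_area=500.0):
    """Locate bright green blobs and return the center of their formation."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Color filter (green hue) combined with a light-source filter
    # (high value channel); thresholds are assumptions.
    mask = cv2.inRange(hsv, (45, 80, 200), (85, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        area = cv2.contourArea(c)
        if min_area <= area <= max_area:          # metric filter on blob size
            m = cv2.moments(c)
            centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    if not centers:
        return None
    return np.mean(centers, axis=0)               # center of the LED formation

def offset_to_angles(center, fx, fy, cx, cy):
    """Map the pixel offset from the principal point to azimuth/elevation."""
    dx, dy = center[0] - cx, center[1] - cy
    azimuth = np.degrees(np.arctan2(dx, fx))      # pinhole model, in degrees
    elevation = np.degrees(np.arctan2(-dy, fy))   # image y axis points down
    return azimuth, elevation

# Synthetic test image: four bright green dots clustered around (700, 300).
img = np.zeros((720, 1280, 3), np.uint8)
for pt in [(690, 290), (710, 290), (690, 310), (710, 310)]:
    cv2.circle(img, pt, 4, (0, 255, 0), -1)
center = find_led_center(img)
print(offset_to_angles(center, fx=800.0, fy=800.0, cx=640.0, cy=360.0))
```

Under a pinhole model, the arctangent of the pixel offset over the focal length yields the pointing angles that would drive the two stepper motors; the dissertation itself should be consulted for the exact conversion used.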
id |
USP_b64f56412d0d56d09747457d75f79352 |
---|---|
oai_identifier_str |
oai:teses.usp.br:tde-12052020-175028 |
network_acronym_str |
USP |
network_name_str |
Biblioteca Digital de Teses e Dissertações da USP |
repository_id_str |
2721 |
dc.title.none.fl_str_mv |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance / Dispositivo autônomo com 2 graus de liberdade para guiar um quadrotor
title |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance |
spellingShingle |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance; Soler, Diego Pavan; agricultura de precisão; computer vision; cooperação robótica; precision agriculture; quadrotor; robot cooperation; UAV; VANT; visão computacional
title_short |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance |
title_full |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance |
title_fullStr |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance |
title_full_unstemmed |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance |
title_sort |
2-DOF Laser Structure for Autonomous External Quadrotor Guidance |
author |
Soler, Diego Pavan |
author_facet |
Soler, Diego Pavan |
author_role |
author |
dc.contributor.none.fl_str_mv |
Becker, Marcelo |
dc.contributor.author.fl_str_mv |
Soler, Diego Pavan |
dc.subject.por.fl_str_mv |
agricultura de precisão; computer vision; cooperação robótica; precision agriculture; quadrotor; robot cooperation; UAV; VANT; visão computacional
topic |
agricultura de precisão; computer vision; cooperação robótica; precision agriculture; quadrotor; robot cooperation; UAV; VANT; visão computacional
description |
This dissertation, 2-DOF Laser Structure for Autonomous External Quadrotor Guidance, proposes a novel method for attracting an aerial robot to a ground vehicle for autonomous landing, using a system composed of a camera and a laser. The application is designed to be used alongside other guidance methods. This is a complex problem that requires work in different fields on both the ground and the aerial robot. On the ground station, known as Mirã, a structure was designed using 3D-printed parts and two stepper motors set to 1:16 microstepping to precisely orient a camera and a laser with 2 DoF. The motors are controlled by a computer vision algorithm that tracks four green LEDs mounted on the aerial robot. By combining a light-source filter and a color filter with metric filters, the algorithm locates the LEDs in the image. From the offset between the center of the LED formation and the center of the image, the algorithm uses the camera calibration parameters to compute azimuth and elevation angles, allowing the structure to follow and aim the laser at the aerial robot with errors of 4.96° in elevation and 0.27° in azimuth, with standard deviations of 1.03° and 0.51°, respectively. To call the aerial robot, the structure performs a linear cut by moving the elevation motor; for this reason, the azimuth precision is made much higher. On the aerial vehicle, a 0.5 x 0.5 m sensor built from a 6 x 6 matrix of photodetectors was designed to estimate the position of the laser dot on the sensor at a rate of 200 Hz. The aerial robot decodes the sensor readings and accumulates multiple samples; once enough points are available, a linear approximation of the point collection is used to estimate a vector, whose direction is determined by comparing the older and newer points, resulting in a velocity vector that the aerial robot follows. Once the vehicle is close enough, the structure performs a cut in the opposite direction, commanding the aerial robot to stop and allowing for reliable autonomous landing.
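On the aerial side, the description above fits the accumulated laser-dot positions from the 6 x 6 photodetector matrix with a line and orients it from the older samples toward the newer ones to obtain the velocity vector the quadrotor follows. The sketch below illustrates that idea with a total-least-squares fit in Python; the buffer length, the SVD-based fit, and the half-buffer orientation test are assumptions for illustration, not the dissertation's stated implementation.

```python
import numpy as np

def guidance_vector(points):
    """points: (N, 2) laser-dot positions on the sensor plane, ordered from
    oldest to newest. Returns a unit direction vector for the quadrotor."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Principal direction of the point cloud via SVD (total least squares fit).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    # Orient the fitted line from the older half of the buffer to the newer half.
    old_mean = pts[: len(pts) // 2].mean(axis=0)
    new_mean = pts[len(pts) // 2 :].mean(axis=0)
    if np.dot(new_mean - old_mean, direction) < 0:
        direction = -direction
    return direction / np.linalg.norm(direction)

# Example: the laser dot drifts across the sensor mostly in the +x direction.
trace = [(0.05 * i, 0.01 * i + 0.002 * np.random.randn()) for i in range(10)]
print(guidance_vector(trace))
```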
publishDate |
2019 |
dc.date.none.fl_str_mv |
2019-09-06 |
dc.type.status.fl_str_mv |
info:eu-repo/semantics/publishedVersion |
dc.type.driver.fl_str_mv |
info:eu-repo/semantics/masterThesis |
format |
masterThesis |
status_str |
publishedVersion |
dc.identifier.uri.fl_str_mv |
https://www.teses.usp.br/teses/disponiveis/18/18149/tde-12052020-175028/ |
url |
https://www.teses.usp.br/teses/disponiveis/18/18149/tde-12052020-175028/ |
dc.language.iso.fl_str_mv |
eng |
language |
eng |
dc.relation.none.fl_str_mv |
|
dc.rights.driver.fl_str_mv |
Release the content for public access. info:eu-repo/semantics/openAccess
rights_invalid_str_mv |
Release the content for public access.
eu_rights_str_mv |
openAccess |
dc.format.none.fl_str_mv |
application/pdf |
dc.coverage.none.fl_str_mv |
|
dc.publisher.none.fl_str_mv |
Biblioteca Digital de Teses e Dissertações da USP
publisher.none.fl_str_mv |
Biblioteca Digital de Teses e Dissertações da USP
dc.source.none.fl_str_mv |
reponame:Biblioteca Digital de Teses e Dissertações da USP instname:Universidade de São Paulo (USP) instacron:USP |
instname_str |
Universidade de São Paulo (USP) |
instacron_str |
USP |
institution |
USP |
reponame_str |
Biblioteca Digital de Teses e Dissertações da USP |
collection |
Biblioteca Digital de Teses e Dissertações da USP |
repository.name.fl_str_mv |
Biblioteca Digital de Teses e Dissertações da USP - Universidade de São Paulo (USP) |
repository.mail.fl_str_mv |
virginia@if.usp.br || atendimento@aguia.usp.br
_version_ |
1815256894365761536 |