Sensor fusion for cooperative perception in intelligent transport systems

Bibliographic details
Principal author: Pinho, Hugo Emanuel Silva
Publication date: 2023
Document type: Master's dissertation
Language: English
Source: Repositório Científico de Acesso Aberto de Portugal (Repositórios Científicos)
Full text: http://hdl.handle.net/10773/40214
Abstract: In a world that is becoming increasingly interconnected, autonomous driving systems require robust communication platforms. Modern vehicles are not only machines but also data centers that gather and distribute data from various sensors and devices to improve their decision-making. Accuracy, reliability, and adherence to communication protocols are crucial as cooperative systems increasingly rely on this information. Consider a scenario where an autonomous vehicle approaches an intersection and receives a message from another vehicle indicating that it is slowing down to allow the autonomous vehicle to pass safely. Due to incorrect information in the message, however, the autonomous vehicle assumes that the other vehicle has stopped and proceeds without reducing its speed. In the same scenario, the infrastructure may send misleading information indicating a green traffic light at the intersection when it is actually red; as a result, the autonomous vehicle crosses the intersection unsafely, disregarding the correct signal. These situations demonstrate the critical importance of data quality in cooperative perception and connected vehicle systems. Incorrect, imprecise, or unsafe data can lead to faulty decision-making by connected vehicles and infrastructure, endangering the safety of road users. The primary objective of this study is to overcome the challenges of object classification in cooperative perception messages transmitted by roadside units, ensuring higher accuracy and reliability. Radars installed on roadside infrastructure accurately detect object positions and speeds, but have difficulty determining the type of object detected. This dissertation relies on image classification methods to address these challenges and to provide contextual information, which is then integrated with the detected object data using an event-based sensor fusion model.
This model incorporates state-of-the-art artificial intelligence algorithms and late fusion techniques to fuse data from multiple sensors. Late fusion was selected because it allows individually classified data to be integrated into a unified service. The study's findings demonstrate the methodology's effectiveness at diverse times of day, conditions that have historically hindered the performance of roadside infrastructure systems such as cameras. The fusion model significantly improved data classification: compared with radar-only classifications, the fused data improved by 9.85% at night and by 23.56% during the day. This study therefore presents an effective approach to enhancing a radar's comprehension of its surroundings, particularly when it lacks precise image classification capabilities. By deploying this model on existing infrastructure, the quality of generated and transmitted messages can be greatly improved. This research confirms the feasibility of improving the data contained in transmitted cooperative messages, advancing cooperative driving technologies and promoting safer connected vehicles and more effective cooperative intelligent transportation systems.
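The late-fusion step described in the abstract — attaching an image-classifier label to each radar detection to form a single enriched object — can be illustrated with a minimal sketch. This is not the dissertation's actual implementation: the `RadarDetection` and `CameraClassification` types, the nearest-neighbour association, and the 3 m matching gate are all assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    obj_id: int
    x: float        # position in a shared ground-plane frame (m)
    y: float
    speed: float    # m/s, as measured by the roadside radar

@dataclass
class CameraClassification:
    x: float                # classifier output projected into the same frame
    y: float
    label: str              # e.g. "car", "pedestrian" (from an image classifier)
    confidence: float

def late_fuse(radar, cameras, max_dist=3.0):
    """Late fusion: each sensor classifies independently; results are merged
    afterwards by matching each radar detection to the nearest camera label."""
    fused = []
    for det in radar:
        best, best_d = None, max_dist
        for cam in cameras:
            d = ((det.x - cam.x) ** 2 + (det.y - cam.y) ** 2) ** 0.5
            if d < best_d:
                best, best_d = cam, d
        # Fall back to the radar's weak type information when no camera
        # classification lies within the matching gate.
        fused.append({
            "id": det.obj_id,
            "x": det.x,
            "y": det.y,
            "speed": det.speed,
            "class": best.label if best else "unknown",
        })
    return fused
```

The key property of late fusion that the abstract highlights — that each sensor's output is usable on its own and is only combined at the object level — is visible here: the radar list and the camera list are produced independently, and the fusion step merely associates them.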
Subjects: CPM; Connected vehicles; Sensor fusion; Autonomous vehicles; RSU; ITS; C-ITS; Vehicular communications; Collective perception; Image classification; YOLO
Publication date: 2023-12-05
Access rights: open access
Format: application/pdf
Institution: Agência para a Sociedade do Conhecimento (UMIC) - FCT - Sociedade da Informação