Please use this identifier to cite or link to this item:
http://hdl.handle.net/11375/30084
Title: Sensor Fusion of LiDAR and Camera for Lane Detection with Instance Segmentation and GPS Integration using Kalman Filtering for Target Tracking on Snowy Roads
Other Titles: Sensor Fusion for Lane Detection On Snow Covered Roads
Authors: Hidajat, Severin
Advisor: Emadi, Ali
Department: Electrical and Computer Engineering
Keywords: Sensor Fusion; Lane Detection; Autonomous Vehicle; Instance Segmentation; GPS Integration; Kalman Filtering; Camera; LiDAR
Publication Date: 2024
Abstract: This thesis explores advanced sensor fusion techniques to enable robust lane detection for autonomous vehicles, even in adverse weather conditions like heavy snowfall. Autonomous driving technology has progressed rapidly in recent years, with the integration of various driver assistance features such as Lane Centering Assistance (LCA), Lane Keeping Assistance (LKA), and Lane Departure Warning (LDW). These systems rely heavily on accurate lane marking detection to maintain the vehicle's position within the lane and provide timely alerts to the driver. However, in snowy conditions, the visibility and reliability of these visual lane markers can be severely compromised, posing a significant challenge for current autonomous driving systems. Therefore, this research investigates the integration of data from multiple sensor modalities, including cameras, LiDAR, and GPS, to enable precise lane tracking even with environmental obstructions. By fusing these complementary sensors, the system can maintain accurate lane detection, enabling enhanced performance of lane-related assistance features and contributing to safer and more robust navigation for autonomous driving. In addition to the sensor fusion approach, the thesis explores novel methodologies, such as infrared imaging and ground penetrating radar, to improve autonomous navigation in complex environments. These innovative techniques provide alternative sensing capabilities that can complement the camera and LiDAR data, enhancing the overall robustness and adaptability of the autonomous driving system. The detailed annotation and model training presented in this work underscore the potential for these methods to significantly enhance autonomous driving systems, particularly in adverse weather conditions. The research paves the way for safer and more effective navigation in complex driving environments, addressing a critical challenge in advancing autonomous vehicle technology.
By developing robust and adaptable lane detection solutions, this thesis contributes to the goal of enabling autonomous vehicles to operate reliably year-round, regardless of weather conditions.
Description: The goal of this thesis project is to enhance the safety and reliability of self-driving vehicles, particularly in areas prone to snow and harsh weather. By employing camera, LiDAR, and GPS sensor fusion, instance segmentation, and Kalman filtering, this research seeks to overcome the limitations of current lane detection systems in adverse conditions, laying the foundation for more resilient and adaptable autonomous driving solutions capable of confidently navigating complex environments.
URI: http://hdl.handle.net/11375/30084
Appears in Collections: Open Access Dissertations and Theses
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
HIDAJAT_Severin_K_202408_MASc.pdf | | 15.86 MB | Adobe PDF | View/Open |
Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.
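As background on the filtering technique named in the title and keywords: a Kalman filter maintains a state estimate and its uncertainty, alternating a predict step (uncertainty grows) with an update step (a noisy measurement is blended in via the Kalman gain). The sketch below is illustrative only and is not the thesis's implementation; the one-dimensional state (lateral lane offset), the noise variances, and the measurement sequence are all invented for demonstration.

```python
# Illustrative 1-D Kalman filter: smooth a noisy lateral lane-offset signal.
# The scalar state x is the vehicle's lateral offset from the lane centre
# (metres). All noise parameters below are made-up demonstration values.

def kalman_1d(measurements, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Return filtered estimates of a scalar state from noisy measurements.

    q  : process noise variance (how much the true offset may drift per step)
    r  : measurement noise variance (sensor uncertainty)
    x0 : initial state estimate
    p0 : initial estimate variance
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the offset is modelled as roughly constant, so only the
        # uncertainty grows by the process noise q.
        p += q
        # Update: the Kalman gain k weights the measurement against the
        # prediction in proportion to their relative uncertainties.
        k = p / (p + r)
        x += k * (z - x)
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

# Noisy readings of a true offset of about 0.5 m:
zs = [0.9, 0.4, 0.7, 0.3, 0.6, 0.5, 0.8, 0.2]
print(kalman_1d(zs)[-1])  # estimate settles near the true 0.5 m offset
```

A full lane-tracking filter would instead carry a multi-dimensional state (offset, heading, curvature) and fuse measurements from several sensors, but the predict/update structure is the same.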