Please use this identifier to cite or link to this item:
http://hdl.handle.net/11375/29905
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | von Mohrenschildt, Martin | - |
dc.contributor.advisor | Habibi, Saeid | - |
dc.contributor.author | Sochaniwsky, Adrian | - |
dc.date.accessioned | 2024-06-27T14:17:55Z | - |
dc.date.available | 2024-06-27T14:17:55Z | - |
dc.date.issued | 2024 | - |
dc.identifier.uri | http://hdl.handle.net/11375/29905 | - |
dc.description.abstract | Intelligent Transportation Systems are advanced technologies used to reduce traffic and increase road safety for vulnerable road users. Real-time traffic monitoring is an important technology for collecting and reporting the information required to achieve these goals through the detection and tracking of road users inside an intersection. To be effective, these systems must be robust to all environmental conditions. This thesis explores the fusion of camera and Light Detection and Ranging (LiDAR) sensors to create an accurate and real-time traffic monitoring system. Sensor fusion leverages complementary characteristics of the sensors to increase system performance in low-light and inclement weather conditions. To achieve this, three primary components are developed: a 3D LiDAR detection pipeline, a camera detection pipeline, and a decision-level sensor fusion module. The proposed pipeline is lightweight, running at 46 Hz on modest computer hardware, and accurate, scoring 3% higher than the camera-only pipeline on the Higher Order Tracking Accuracy metric. The camera-LiDAR fusion system is built on the ROS 2 framework, which provides a well-defined and modular interface for developing and evaluating new detection and tracking algorithms. Overall, the fusion of camera and LiDAR sensors will enable future traffic monitoring systems to provide cities with real-time information critical for increasing safety and convenience for all road users. | en_US |
dc.language.iso | en | en_US |
dc.subject | computer vision | en_US |
dc.subject | LiDAR | en_US |
dc.subject | object detection | en_US |
dc.subject | multi-object tracking | en_US |
dc.subject | intelligent transportation systems | en_US |
dc.subject | sensor fusion | en_US |
dc.title | A LIGHTWEIGHT CAMERA-LIDAR FUSION FRAMEWORK FOR TRAFFIC MONITORING APPLICATIONS | en_US |
dc.title.alternative | A CAMERA-LIDAR FUSION FRAMEWORK | en_US |
dc.type | Thesis | en_US |
dc.contributor.department | Computing and Software | en_US |
dc.description.degreetype | Thesis | en_US |
dc.description.degree | Master of Applied Science (MASc) | en_US |
dc.description.layabstract | Accurate traffic monitoring systems are needed to improve the safety of road users. These systems allow the intersection to “see” vehicles and pedestrians, providing near-instant information to assist future autonomous vehicles, and provide data to city planners and officials to enable reductions in traffic, emissions, and travel times. This thesis aims to design, build, and test a traffic monitoring system that uses a camera and 3D laser scanner to find and track road users in an intersection. By combining a camera and a 3D laser scanner, this system aims to perform better than either sensor alone. Furthermore, this thesis will collect test data to prove the system is accurate and able to see vehicles and pedestrians during the day and night, and to test whether it runs fast enough for “live” use. | en_US |
Appears in Collections: | Open Access Dissertations and Theses |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
sochaniwsky_adrian_r_2024june_masc.pdf | | 6.35 MB | Adobe PDF | View/Open |
Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.