
Direct estimation of forest aboveground biomass from UAV LiDAR and RGB observations in forest stands with various tree densities

dc.contributor.advisor: Gonsamo, Alemu
dc.contributor.author: So, Kangyu
dc.contributor.department: Earth and Environmental Sciences
dc.date.accessioned: 2025-07-21T17:49:30Z
dc.date.available: 2025-07-21T17:49:30Z
dc.date.issued: 2025
dc.description.abstract: Canada’s vast forests play a substantial role in the global carbon balance but require laborious and expensive forest inventory campaigns to monitor changes in aboveground biomass (AGB). Light detection and ranging (LiDAR) or reflectance observations onboard airborne or unoccupied aerial vehicles (UAVs) may address the scalability limitations of traditional forest inventory, but they require simple forest structures or large sets of manually delineated crowns. Here, we introduce a deep learning approach to crown delineation and AGB estimation that is reproducible for complex forest structures and does not rely on hand annotations for training. First, we detect treetops and delineate crowns in the LiDAR point cloud using marker-controlled watershed segmentation (MCWS). Then, we train a deep learning model on annotations derived from MCWS to predict crowns in UAV red, green, and blue (RGB) tiles. Finally, we estimate AGB metrics from allometric equations based on tree height and crown diameter, all derived from UAV data. We validate our approach on a 14-ha mixed forest stand with various experimental tree densities in Southern Ontario, Canada. Our results demonstrate an 18% improvement in AGB estimation accuracy when the unsupervised LiDAR-only algorithm is followed by the self-supervised RGB deep learning model. In unharvested stands, the self-supervised RGB model performs well for height (R² = 0.79) and AGB (R² = 0.80) estimation. In thinned stands, the performance of both the unsupervised and self-supervised methods varies with stand density, crown clumping, canopy height variation, and species diversity. These findings suggest that MCWS can be supplemented with self-supervised deep learning to directly estimate biomass components in complex forest structures, as well as in atypical forest conditions where stand density and spatial patterns are manipulated.
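
To make the pipeline sketched in the abstract concrete, here is a minimal, hypothetical Python illustration of its first and last steps: marker-controlled watershed segmentation on a canopy height model (CHM), and a height- and crown-diameter-based allometric estimate of AGB. It assumes the CHM has already been rasterized from the UAV LiDAR point cloud, it uses scikit-image's generic watershed rather than whatever implementation the thesis employed, and every function name, threshold, and allometric coefficient below is an illustrative placeholder, not a parameter from the thesis.

import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def delineate_crowns(chm, min_height=2.0, min_distance=5):
    # Hypothetical MCWS sketch: chm is a 2-D array of canopy heights (m).
    canopy = chm > min_height  # mask out ground and low vegetation
    # Treetops = local height maxima; these seed (mark) the watershed.
    peaks = peak_local_max(chm, min_distance=min_distance, threshold_abs=min_height)
    markers = np.zeros(chm.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    # Flood the inverted CHM so each basin grows outward from one treetop,
    # yielding a labelled crown map (0 = background).
    return watershed(-chm, markers, mask=canopy)

def estimate_agb(height_m, crown_diameter_m, a=0.05, b=2.0, c=1.0):
    # Generic power-law allometry AGB = a * H**b * CD**c (kg per tree).
    # Coefficients a, b, c are placeholders; the thesis fits equations
    # appropriate to the species mix of the study site.
    return a * height_m ** b * crown_diameter_m ** c

In a workflow of this shape, per-tree height can be read as the maximum CHM value within each labelled crown and crown diameter derived from the crown's pixel area, after which per-tree AGB estimates are summed to the stand level.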
dc.description.degree: Master of Science (MSc)
dc.description.degreetype: Thesis
dc.description.layabstract: The effects of forest thinning practices on biomass regeneration are not well understood, as traditional field methods for measuring forest characteristics are costly and impractical over large spatial extents. To monitor and report on biomass components more effectively, we used unoccupied aerial vehicle (UAV) imagery and laser scanning observations, segmentation algorithms, and a deep learning predictive model for a 14-ha mixed forest stand in Southern Ontario. Laser scanning observations were segmented into tree crowns to train the deep learning model, which output the crown size, height, and biomass of individual trees from UAV imagery. Our results indicate that a combined segmentation and modelling approach can provide accurate estimates of biomass components in forests, even under conditions where stand density and spatial patterns are manipulated.
dc.identifier.uri: http://hdl.handle.net/11375/32011
dc.language.iso: en
dc.subject: LiDAR
dc.subject: UAV
dc.subject: biomass
dc.subject: unmanned aerial vehicle
dc.subject: crown delineation
dc.subject: self-supervised deep learning
dc.title: Direct estimation of forest aboveground biomass from UAV LiDAR and RGB observations in forest stands with various tree densities
dc.type: Thesis

Files

Original bundle

Name: So_Kangyu_M_202506_MSc.pdf
Size: 1.7 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.68 KB
Format: Item-specific license agreed upon at submission