FEASIBILITY COMPARISON OF AIRBORNE LASER SCANNING DATA AND 3D-POINT CLOUDS FORMED FROM UNMANNED AERIAL VEHICLE (UAV)-BASED IMAGERY USED FOR 3D PROJECTING

DOI: 10.24057/2414-9179-2017-3-23-31-46


About the Authors

I. I. Rilskiy

Lomonosov Moscow State University
Russian Federation
Faculty of Geography; 119991, Moscow, Leninskie Gory, 1

I. V. Kalinkin

Lomonosov Moscow State University
Russian Federation
Faculty of Geography; 119991, Moscow, Leninskie Gory, 1

Abstract

New, innovative methods of aerial surveying have dramatically changed the approaches to information support of design work over the last 15 years. Nowadays at least two methods claim to be the most efficient way of collecting geospatial data for design purposes: airborne laser scanning (LIDAR) and photogrammetrically processed unmanned aerial vehicle (UAV) imagery, which yields 3D point clouds. These materials, however, are not identical to each other either in precision or in completeness.

Airborne laser scanning (LIDAR) is normally performed from manned aircraft. LIDAR data are very precise: they make it possible to obtain information about the relief even under vegetation cover, and to collect laser returns from wires, metal structures and poles. UAV surveys are normally performed with frame digital cameras (lightweight, full-frame or medium-format). The resulting images are processed with 3D photogrammetric software in automatic mode, which generates a 3D point cloud used for building digital elevation models, surfaces, orthomosaics, etc.
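As a rough illustration (not taken from the article), the step that turns a photogrammetric point cloud into a digital elevation model can be sketched as a simple gridding of point elevations; the point cloud and the 5 m cell size below are synthetic assumptions:

```python
import numpy as np

# Synthetic 3D point cloud: columns are x, y, z (metres).
rng = np.random.default_rng(0)
points = np.column_stack([
    rng.uniform(0, 100, 10_000),    # x
    rng.uniform(0, 100, 10_000),    # y
    rng.uniform(200, 210, 10_000),  # z (elevation)
])

def rasterize_dem(points, cell=5.0):
    """Average point elevations per grid cell -> a simple DEM raster."""
    col = (points[:, 0] // cell).astype(int)
    row = (points[:, 1] // cell).astype(int)
    nrow, ncol = row.max() + 1, col.max() + 1
    dem_sum = np.zeros((nrow, ncol))
    count = np.zeros((nrow, ncol))
    np.add.at(dem_sum, (row, col), points[:, 2])  # accumulate z per cell
    np.add.at(count, (row, col), 1)               # count points per cell
    # Cells without points stay NaN (data gaps).
    return np.where(count > 0, dem_sum / np.maximum(count, 1), np.nan)

dem = rasterize_dem(points)
print(dem.shape)  # (20, 20) for these synthetic points
```

Real photogrammetric packages interpolate and filter far more carefully; the sketch only shows why point density directly limits the achievable DEM resolution.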

All these materials are traditionally used for making maps and GIS data. LIDAR data have also become popular in design work, and there have been attempts to use 3D point clouds formed by photogrammetric software from UAV imagery for the same purpose.

A comparison of the datasets from these two types of surveying (the surveys were made simultaneously over the same territory) made it possible to identify features typical of LIDAR data and of imagery-based 3D data. It should be noted that imagery-based 3D point clouds formed automatically by photogrammetric processing are considerably inferior to LIDAR data, both in precision and in completeness.
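The kind of comparison described above can be quantified per grid cell; a minimal sketch, using hypothetical gridded DEMs from the two surveys (all values synthetic), is a vertical RMSE together with a completeness figure, i.e. the share of cells where the imagery-based cloud produced any elevation at all:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gridded DEMs over the same extent (metres):
lidar_dem = rng.uniform(200, 210, (20, 20))
# Imagery-based DEM: noisier, with NaN gaps where image matching failed
# (e.g. under vegetation) -- a 15 % dropout rate is assumed here.
photo_dem = lidar_dem + rng.normal(0, 0.3, (20, 20))
photo_dem[rng.random((20, 20)) < 0.15] = np.nan

valid = ~np.isnan(photo_dem)
rmse = np.sqrt(np.mean((photo_dem[valid] - lidar_dem[valid]) ** 2))
completeness = valid.mean()
print(f"vertical RMSE: {rmse:.2f} m, completeness: {completeness:.0%}")
```

The two numbers capture the two axes on which the article finds imagery-based clouds lacking: precision (RMSE) and completeness (coverage).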

The article highlights these differences and attempts to explain their origin.

Keywords

LIDAR, airborne laser scanning, unmanned aerial vehicle (UAV), photogrammetric processing, 3D point cloud, aerial imagery


For citation: Rilskiy I.I., Kalinkin I.V. FEASIBILITY COMPARISON OF AIRBORNE LASER SCANNING DATA AND 3D-POINT CLOUDS FORMED FROM UNMANNED AERIAL VEHICLE (UAV)-BASED IMAGERY USED FOR 3D PROJECTING. Proceedings of the International conference “InterCarto. InterGIS”. 2017;23(3):31–46. DOI: 10.24057/2414-9179-2017-3-23-31-46 (in Russian)