A Comparison of Drone-Based SfM and Drone-Based Lidar for Dense Topographic Mapping Applications

COLLEGE OF ENGINEERING
School of Civil and Construction Engineering

A Comparison of Drone-Based SfM and Drone-Based Lidar for Dense Topographic Mapping Applications
Chase Simpson, MS, EIT, FS
Instructor of Geomatics
Oregon State University

Motivation
UAS for Topographic Mapping

A large number of surveyors and geospatial professionals are using, or are interested in using, UAS for topographic mapping.
- Reported that UAS can reduce person-hours in surveying by up to 60% (van Rees, 2018)

There are two main methods of topographic mapping from UAS:
1. Structure from Motion (SfM) / Multi-View Stereo (MVS) software applied to drone imagery
2. Light detection and ranging (lidar) on UAS

Confusing and contradictory information exists on which is "better".

Background on Remote Sensing Systems
Active Sensors (UAS-lidar)

A remote sensor emitting an energy source and measuring return strength and travel time.
- Lidar, radar, etc.
- Invariant to ambient lighting conditions
- Typically more expensive

Background on Remote Sensing
Passive Sensors (UAS-SfM)

A remote sensor collects energy reflected off of an object from an external source.
- Cameras: RGB, NIR, IR
- Reliant on ambient lighting conditions & environmental factors
- Lower cost

Background on SfM
What is Structure from Motion?

- Relatively new photogrammetric approach
  - Leverages advanced image-matching algorithms from the field of computer vision
- Many requirements are relaxed, as compared with conventional photogrammetry:
  - Can work with a wide range of viewing geometries and consumer-grade cameras
  - Well suited to UAS imagery!
  - Highly automated, easy-to-use software: Agisoft Metashape, Pix4D, etc.
- Typically consists of two different steps:
  - Image matching & recovery of camera parameters (SfM)
  - Dense reconstruction (MVS)

Specific Goals of Research

Provide information on each platform to aid professionals in selecting the most advantageous technique based on project requirements.
a) Quantitative assessment of UAS-lidar and UAS-SfM in comparison with terrestrial lidar and highly accurate check points
b) Qualitative assessment along the following dimensions:
   i. Cost
   ii. System complexity
   iii. Learning curve
   iv. Remote aircraft payload requirements

Project Location
Port of Skamania – Stevenson, WA

Summary:
- Large elevation gradients
- Tree canopy
- Multiple surface types
- Area: 8 acres

Port of Skamania
Surface Characteristics

Complex variety of surface types:
- Grass
- Concrete
- Bare earth
- Gravel
- Asphalt

Reference Dataset
Control Survey: Acquisition

Purpose:
- Control for terrestrial lidar & UAS data
- Provide check points for vertical accuracy assessment following ASPRS standards:
  - All check points (CPs) on flat or uniformly sloped open terrain
  - Minimum of 20 points for each surface type

Check points:
- Asphalt: 43 points
- Bare earth: 32 points
- Grass: 25 points

Estimated uncertainties at a 95% confidence level, reported from the least-squares adjustment:
- Horizontal: 1.3 cm
- Vertical: 2.4 cm
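The ASPRS-style "accuracy at a 95% confidence level" figures quoted throughout this deck are derived from check-point residuals. A minimal sketch, assuming a simple list of vertical residuals and using the ASPRS convention for non-vegetated terrain (vertical accuracy at 95% = 1.9600 × RMSEz); the residual values below are hypothetical:

```python
import math

def rmse(errors):
    """Root-mean-square of check-point residuals (same units in and out)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def vertical_accuracy_95(dz_errors):
    """Vertical accuracy at the 95% confidence level (ASPRS NVA convention)."""
    return 1.9600 * rmse(dz_errors)

# Hypothetical residuals (m) between surveyed check points and a point cloud:
dz = [0.012, -0.008, 0.015, -0.011, 0.009]
print(round(vertical_accuracy_95(dz), 4))
```

The same RMSE-based statistic, computed separately per surface type (asphalt, bare earth, grass), is what the accuracy charts later in the deck compare.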

Reference Data
Terrestrial Lidar: Acquisition

Purpose:
- Provide reference as "truth" when visually comparing UAS datasets

Summary:
- Lidar scanner: Leica P40
- Mounted GNSS: Leica GS14 utilizing ORGN
- Acquired density: 1 cm @ 30 m from scanner
- 10 total scan positions used
- Performed manual ground classification

[Figures: point cloud before and after manual classification]

RMSE at 95% confidence level:
- Horizontal: 1.1 cm
- Vertical: 1.2 cm

UAS-Lidar
Platform Specs

- DJI M600 Pro w/ A3 flight controller
- Phoenix Lidar Systems AL3-32:
  - Lidar scanner: Velodyne HDL-32
  - GNSS-aided INS: NovAtel KVH1725
  - Camera: Sony A6000

Cost: $100,000

Note: This platform is median cost among available UAS-lidar platforms. Cost increases substantially for increased accuracy.

All images acquired from: https://www.phoenixlidar.com/

UAS-Lidar
Data Acquisition

Mission planning:
- Software: Phoenix Lidar SpatialExplorer
- Altitude: 180 feet AGL
- Sidelap: 75%
- Flying speed: 8 m/s
- Planned point density: 150 points/m²
- Multiple returns (first/last)

Data processing:
a) NovAtel Inertial Explorer: processes trajectory information
b) Phoenix Aerial's Inertial Explorer: combines lidar and processed trajectory
c) TerraSolid Suite (TerraMatch/TerraScan): maximizes relative accuracy between flightlines; point cloud classification

UAS-SfM
Platform Specs

Airframe:
- DJI S900
- Pixhawk flight controller

Camera:
- Sony A6300 (24 MP)
- 30 mm lens
- Fixed mount

Positioning system:
- Piksi Multi GNSS receiver (GPS + GLO)
- Dual-frequency helical GNSS antenna
- Records a time stamp for each acquired image

Cost: $4,000 (COTS: $7,500–$15,000)

UAS-SfM
Data Acquisition

Mission planning:
- Software: ArduPilot Mission Planner
- Altitude: 377 feet AGL
- Planned GSD: 1.5 cm
- Sidelap: 80%
- Overlap: 80%
- Flying speed: 5 m/s

Camera parameters:
- File format: Raw
- Shutter speed: 1/1250
- Aperture: f/5.6
- ISO [min, max]: [100, 400]
- Focus mode: Auto (center)
- White balance: Fixed

UAS-SfM
Data Processing: Overview

RTKLIB:
- Processes trajectory information
- GPS-only & GPS+GLONASS

MATLAB:
- Creates a .csv file with the coordinates of the aircraft for each image
- Applies a median filter to the dense point cloud(s)

Agisoft Photoscan (study completed before the rebranding to "Metashape"):
- Imagery alignment
- Dense point cloud creation
- Point cloud classification
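The "coordinates of the aircraft for each image" step pairs each image's recorded time stamp with the processed GNSS trajectory. A minimal sketch of that idea (the data layout and values here are hypothetical; the study's actual script is in MATLAB): linearly interpolate the trajectory to each exposure time.

```python
import bisect

def interpolate_position(traj, t):
    """traj: time-sorted list of (time_s, x, y, z) fixes; returns (x, y, z) at time t.

    Exposure times outside the trajectory are clamped to the nearest fix.
    """
    times = [p[0] for p in traj]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return traj[0][1:]
    if i == len(traj):
        return traj[-1][1:]
    t0, *p0 = traj[i - 1]
    t1, *p1 = traj[i]
    w = (t - t0) / (t1 - t0)  # linear blend between the bracketing fixes
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

# Hypothetical 1 Hz trajectory and an image exposed halfway between fixes:
traj = [(0.0, 100.0, 200.0, 50.0), (1.0, 104.0, 202.0, 50.4)]
print(interpolate_position(traj, 0.5))  # midpoint of the two fixes
```

Each interpolated position, written out per image as a .csv row, is what Photoscan consumes as camera-station coordinates during alignment.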

UAS-SfM
Data Processing: Structure from Motion

Processing workflow:
- Held constant for all datasets
- Based on USGS recommended workflow

Workflow steps:
1. Import data
2. Mask out water from imagery
3. Adjust camera accuracies
4. Align photos (sparse cloud)
5. Optimize cameras (for GCP sensitivity analysis: 1 GCP vs. 5 GCPs)
6. Create dense point cloud
7. Classify ground points
8. Export dense SfM ground points (.las)
9. Apply median filter (MATLAB) → median-filtered point cloud

Median filter summary:
- Binning algorithm
- 5 cm x 5 cm bins
- Reduces noise
- Decreases point density
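The binning median filter described above can be sketched in a few lines. This is a minimal illustration, not the study's MATLAB implementation: points (assumed here as plain (x, y, z) tuples) are grouped into 5 cm × 5 cm horizontal bins, and each bin is collapsed to a single point at the bin's median coordinates, which reduces noise and point density together.

```python
from collections import defaultdict
from statistics import median

def median_filter(points, bin_size=0.05):
    """Collapse (x, y, z) points into one median point per bin_size x bin_size cell."""
    bins = defaultdict(list)
    for x, y, z in points:
        # Horizontal bin index; // floors correctly for negative coordinates too.
        key = (int(x // bin_size), int(y // bin_size))
        bins[key].append((x, y, z))
    out = []
    for pts in bins.values():
        xs, ys, zs = zip(*pts)
        out.append((median(xs), median(ys), median(zs)))
    return out

# Three noisy points falling in one 5 cm bin collapse to a single median point:
pts = [(0.01, 0.01, 10.0), (0.02, 0.02, 10.4), (0.03, 0.03, 10.1)]
print(median_filter(pts))
```

Because the median is used rather than the mean, a single high-noise return in a bin (common on poorly textured SfM surfaces) does not drag the output point away from the surface.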

Summary of Resulting Data
Point Clouds

Point Cloud                        Avg. Point Density (pts/m²)   Avg. Point Spacing (cm)
Terrestrial lidar (reference)      7000                          1.2
UAS-lidar                          50                            14.5
UAS-SfM raw                        5500                          1.3
UAS-SfM grid                       350                           5.4

[Point cloud renderings: terrestrial lidar, UAS-lidar, UAS-SfM raw, UAS-SfM grid]
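The density and spacing columns in the table above are consistent with each other: for a roughly uniform point cloud, average spacing is approximately 1/√density. A quick sketch checking that relation against the tabulated densities (the small differences from the tabulated spacings reflect the uniform-cloud assumption):

```python
import math

def spacing_cm(density_pts_per_m2):
    """Approximate average point spacing (cm) for a uniform point cloud."""
    return 100.0 / math.sqrt(density_pts_per_m2)

for name, density in [("Terrestrial lidar", 7000), ("UAS-lidar", 50),
                      ("UAS-SfM raw", 5500), ("UAS-SfM grid", 350)]:
    print(f"{name}: {spacing_cm(density):.1f} cm")
```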

Qualitative Assessment
SfM Median Filter

Visualized using CloudCompare
[Color scale: standard deviation of bin (m), 0.00–0.10]

Benefits:
- Noise visualization

Key contributors to noise:
- Poor illumination
- Poor texture
- Decrease in overlap and/or sidelap from large vertical obstructions
- Combinations of the above

Qualitative Analysis
Point Cloud Comparisons – Profiles

Profiles showing the importance of texture & lighting for SfM
[Profile figures comparing UAS-SfM raw, terrestrial lidar (reference data), and UAS-lidar]

Qualitative Analysis
Point Cloud Comparisons

[Point cloud panels vs. ground truth: UAS-SfM raw, UAS-SfM grid, terrestrial lidar, UAS-lidar]

Surface characteristics:
- Asphalt: poor texture; flat/hard surface
- Bare earth: good texture; flat/soft surface; rough surface
- Grass: good texture; rough surface; dense vegetation

Qualitative Comparison
Summary of Advantages & Disadvantages

                                        UAS-Lidar                  UAS-SfM
Sensor type                             lidar (active)             RGB camera (passive)
Can penetrate canopy                    yes                        no
Reliant on surface texture              no                         yes
Reliant on lighting conditions          no                         yes
Variables of each data point            position & intensity       position & RGB
Requirement for georeferencing          position and orientation   position only
Cost                                    high                       low
Acquisition time                        low                        low
Operational expertise required          high                       moderate
Processing expertise required           moderate                   low
User-input processing time              high                       moderate
Demand on computing resources           moderate                   high
Point density using typical
  acquisition parameters (pts/m²)       30–250                     350–5500

Quantitative Analysis
Vertical Accuracy Assessment II: UAS-SfM vs. UAS-lidar

What's being compared?

UAS-SfM summary ($4k):
- Passive sensor
- Georeferenced imagery
- 1 GCP
- Median filtered

vs.

UAS-lidar summary ($100k):
- Active sensor
- Directly georeferenced
- 1 GCP

Quantitative Analysis
Vertical Accuracy Assessment II: UAS-SfM vs. UAS-lidar

Compared to ground control:
- Asphalt: 43 points
- Bare earth: 32 points
- Grass: 25 points

[Chart: RMSE at 95% confidence level for UAS-SfM (median filter), UAS-lidar, and terrestrial lidar (reference data)]

Summary:
- UAS-lidar excelled over UAS-SfM on poorly textured surfaces
- UAS-SfM performed similarly to UAS-lidar on bare earth
- Both techniques performed poorly on grass when compared to the reference dataset

Conclusions/Recommendations

UAS-SfM should be the default system*:
- Low cost
- Easy implementation/processing
- Comparable accuracies in many circumstances

UAS-lidar should be implemented when any of these characteristics are present:
- Homogeneous surface texture
- Canopy/vegetation penetration is required
- Poor illumination
- Large quantity of vertical obstructions

* Except when it shouldn't

QUESTIONS?

More details of this study can be found in my Master's thesis, accessed through the Valley Library at OSU.
[Access to Full Report]

Additional Content
(NOT INCLUDED IN PRESENTATION DUE TO TIME CONSTRAINTS)

Quantitative Analysis
Vertical Accuracy Assessment I: SfM Processing

What's being compared:

GNSS constellations:
- Does using multiple constellations provide a more accurate trajectory than using a single constellation?
- GPS-only versus GPS+GLONASS

GCP sensitivity analysis:
- How many GCPs should be used when imagery is georeferenced?
- 1 GCP versus 5 GCPs
- More is probably better, but how much better is it? "I'm not really sure"

Quantitative Analysis
Vertical Accuracy Assessment I: SfM Processing

[Charts: expected results vs. measured results, showing bias and standard deviation]

Summary:
- Bias approached the reference data when more GCPs were used
- Using multiple constellations for the trajectory improved the results in most cases

Quantitative Analysis
Vertical Accuracy Assessment II: UAS-SfM vs. UAS-lidar

[Chart: UAS-SfM (median filter), UAS-lidar, terrestrial lidar (reference data)]

Summary:
- UAS-lidar excelled over UAS-SfM on asphalt
- UAS-SfM performed similarly to UAS-lidar on bare earth
- Both techniques performed poorly on grass when compared to the reference dataset
