SLAM-BASED MAPPING FOR OBJECT RECOGNITION


SLAM-BASED MAPPING FOR OBJECT RECOGNITION

LOH WAN YING

A project report submitted in partial fulfilment of the requirements for the award of the degree of Bachelor of Engineering (Hons) Electronic Engineering

Faculty of Engineering and Green Technology
Universiti Tunku Abdul Rahman

May 2018

DECLARATION

I hereby declare that this project report is based on my original work except for citations and quotations which have been duly acknowledged. I also declare that it has not been previously and concurrently submitted for any other degree or award at UTAR or other institutions.

Signature:
Name: LOH WAN YING
ID No.: 13AGB01857
Date: 04 May 2018

APPROVAL FOR SUBMISSION

I certify that this project report entitled "SLAM-BASED MAPPING FOR OBJECT RECOGNITION" was prepared by LOH WAN YING and has met the required standard for submission in partial fulfilment of the requirements for the award of Bachelor of Engineering (Hons) Electronic Engineering at Universiti Tunku Abdul Rahman.

Approved by,

Signature:
Supervisor: Dr. YAP VOOI VOON
Date:

Signature:
Co-Supervisor: Dr. HUMAIRA NISAR
Date:

The copyright of this report belongs to the author under the terms of the Copyright Act 1987 as qualified by the Intellectual Property Policy of Universiti Tunku Abdul Rahman. Due acknowledgement shall always be made of the use of any material contained in, or derived from, this report.

© 2018, Loh Wan Ying. All rights reserved.

Specially dedicated to
my beloved supervisor, co-supervisor, mother and father

ACKNOWLEDGEMENTS

I would like to thank everyone who contributed to the successful completion of this project. First and foremost, I would like to express my gratitude to my research supervisor, Dr. Yap Vooi Voon, for his invaluable advice, guidance and enormous patience throughout the development of the research.

In addition, I would also like to express my gratitude to my research co-supervisor, Dr. Humaira Nisar, for her guidance and care throughout the FYP. She has been motivating and encouraging, constantly sharing insightful comments and thoughts.

Furthermore, I would also like to express my gratitude to my loving parents and friends, who have helped and encouraged me.

SLAM-BASED MAPPING FOR OBJECT RECOGNITION

ABSTRACT

The aim of this project is to map an unknown environment, autonomously navigate to a 2D navigation goal set by the user, and recognize objects stored in an object database using a custom-made differential-drive mobile robot that runs under the Robot Operating System (ROS) framework. The concept of deploying the robot in search and rescue missions is implemented so that the efficiency of such missions can be improved at a lower cost. The custom-made robot is able to navigate in an unknown environment and feed back sensory data from a Kinect Xbox 360 sensor, together with odometry data, to a PC. It is therefore important for the robot to feed back reliable and accurate odometry data efficiently so that it can localize itself in the unknown environment. The project architecture comprises a personal laptop, a Kinect Xbox 360 sensor, the custom-made robot and an Arduino Mega 2560. The laptop acts as the command center, running the Simultaneous Localization and Mapping (SLAM) algorithm on odometry data received from the Arduino mounted on the robot. A USB connection is established between the Arduino, the custom-made robot and the PC. After a map of the unknown environment is built, Adaptive Monte Carlo Localization (AMCL) is used to localize the robot, and Dijkstra's algorithm is deployed to compute the shortest path to the destination goal. SIFT (Scale-Invariant Feature Transform) is used to extract features from the current frame and match them against the object database to identify and recognize an object whenever the robot comes across it. The location of the object with respect to the Kinect sensor can also be obtained using a 3x3 homography matrix. The project has been implemented successfully, and the custom-made robot is able to map the environment and recognize objects accurately.

TABLE OF CONTENTS

DECLARATION ii
APPROVAL FOR SUBMISSION iii
ACKNOWLEDGEMENTS vi
ABSTRACT vii
TABLE OF CONTENTS viii
LIST OF TABLES xiv
LIST OF FIGURES xvii
LIST OF SYMBOLS / ABBREVIATIONS xxiii
LIST OF APPENDICES xxv

CHAPTER

1 INTRODUCTION 1
1.1 Project Overview 1
1.2 Problem Statements 2
1.3 Aims and Objectives 3

2 LITERATURE REVIEW 4
2.1 Mobile Robot Navigation 4
2.1.1 Localization and Mapping 5
2.1.1.1 Dead Reckoning 5
2.1.1.2 Simultaneous Localization and Mapping (SLAM) 7
2.1.1.3 Monte Carlo Localization (MCL) 14
2.1.2 Path Planning 17
2.1.2.1 Dijkstra's Shortest Path Algorithm 17
2.1.2.2 A Star (A*) Algorithm 19
2.2 GMapping Algorithm 21
2.3 RGB-D Simultaneous Localization and Mapping (SLAM) 23
2.4 Robot Operating System (ROS) 25
2.5 Object Detection and Recognition 26
2.5.1 SURF (Speeded Up Robust Features) Algorithm 27
2.5.2 SIFT (Scale-Invariant Feature Transform) Algorithm 29
2.6 Summary for Robot Navigation (Localization and Mapping) 31
2.7 Summary of Robot Navigation (Path Planning) 32
2.8 Summary of Object Recognition 33

3 METHODOLOGY 34
3.1 Design Specifications 34
3.2 Hardware Specifications 36
3.3 Software Requirements 37
3.4 Robot Design 38
3.5 Indoor Mapping 41
3.5.1 Simultaneous Localization and Mapping GMapping (SLAM-GMapping) Algorithm 42
3.5.2 ORB SLAM2 Algorithm 43
3.6 Practical Implementation of SLAM-GMapping using Custom-made Differential Drive Robot 44
3.6.1 Control of Custom-made Robot
3.6.2 Simultaneous Localization and Mapping (SLAM)
3.6.3 Autonomous Navigation and 2D Navigation Goal 52
3.7 Implement of SLAM-GMapping Algorithm by using ROS Simulation 54
3.7.1 Simulation of Robot Model and Gazebo Environment for ROS 55
3.7.2 Simulation of Sensors in Gazebo 55
3.7.3 Simulation of ROS Navigation Stack 56
3.8 Implementation of RGBD SLAM 56
3.9 Implementation of Find Object 2D Algorithm for Object Detection and Recognition 57
3.10 Camera Calibration on Intrinsic Parameters and Lens of Kinect Sensor 59
3.11 Implementation of ORB-SLAM2 (Oriented FAST and Rotated BRIEF Simultaneous Localization and Mapping)
3.11.1 Setting up Environment for Raspberry Pi 3 and ORB-SLAM2 61
3.11.2 Interfacing ROS across Two Machines 62
3.11.3 Implementation of ORB-SLAM2 for Indoor Mapping 63
3.12 Cost Analysis 65
3.13 Project's Sustainability 66
3.14 Gantt Chart 67
3.14.1 Gantt Chart FYP I 67
3.14.2 Gantt Chart FYP II 68

4 RESULTS AND DISCUSSIONS 69
4.1 Preliminary Work 69
4.1.1 Robot's Specifications 69
4.1.2 Range Detection towards Flat Surface (Wall) by using Kinect Xbox 360 sensor 76
4.1.3 Range Detection towards Curved Surface (Bag) by using Kinect Xbox 360 sensor 78
4.1.4 Range Detection towards Distance between Objects by using Kinect Xbox 360 sensor 80
4.1.5 Intrinsic Calibration of Kinect Xbox 360 sensor 81
4.2 Practical Implementation of SLAM-GMapping using Custom-made Differential Drive Robot 87
4.3 Implement of SLAM-GMapping Algorithm by using ROS Simulation 101
4.4 Relationship between Practical Implementation and ROS Simulation of SLAM-GMapping Algorithm
4.5 Implementation of RGBD-SLAM 115
4.6 Implementation of Find Object 2D Algorithm for Object Detection and Recognition 117
4.7 Implementation of ORB-SLAM2 (Oriented FAST and Rotated BRIEF Simultaneous Localization and Mapping) 126

5.1 Introduction 132
5.2 Review 132
5.2.1 Design of Robot 133
5.2.2 SLAM Algorithm 133
5.2.3 Object Recognition 133
5.3 Conclusion 134
5.4 Future Works 135

REFERENCES 136
APPENDICES 146

LIST OF TABLES

TABLE  TITLE  PAGE

2.1 Summary for Robot Navigation in Localization and Mapping 31
2.2 Summary for Robot Navigation in Path Planning 32
2.3 Summary for Robot Navigation in Object Recognition 33
3.1 Relationship between Key Buttons on Keyboard and Direction of Movement 46
3.2 Output of Swap Space of SD Card Connected to Raspberry Pi 3 61
3.3 Cost analysis of the equipment and materials used 65
3.4 Project's sustainability analysis in terms of hardware and software 66
4.1 Relationship between the motor's setting and the movement of the robot 70
4.2 Total angle of rotation of robot when wheel's diameter and track width are set at 8 cm and 23.7 cm respectively 72
4.3 Total distance travelled of robot when wheel's diameter and track width are set at 8 cm and 23.7 cm respectively 73
4.4 Total angle of rotation of robot when wheel's diameter and track width are set at 8 cm and 24.7 cm respectively 74
4.5 Total distance travelled of robot when wheel's diameter and track width are set at 8 cm and 24.7 cm respectively 75
4.6 The relationship between the actual distance of the wall from the robot and the range measured by Kinect Xbox 360 sensor towards the wall from the robot (cm) 77
4.7 The relationship between the actual distance of the curved object from the robot and the range measured by Kinect Xbox 360 sensor towards the curved object from the robot (cm) 79
4.8 The relationship between the actual distance between objects and Kinect sensor reading for the distance between objects (cm) 80
4.9 The relationship between the default intrinsic parameters and calibrated intrinsic parameters of RGB camera and IR camera 86
4.10 Specifications of robot and other parameters used 89
4.11 Relationship between equations used and data from ROS topics 91
4.12 Specifications of joints and links of Kinect Xbox 360 sensor 103
4.13 Relationship of parent link and child link of specific joints and links 103
4.14 ROS topics to be published by using plugin of Kinect camera controller 103
4.15 Distortion model specified for camera lens of Kinect sensor 104
4.16 Relationship between time taken to map for practical implementation and ROS simulation 109
4.17 Results that show the relationship between the actual number of objects and error of root mean square in both practical implementation and simulation 112
4.18 Results that show the relationship between the actual distance to the 2D navigation goal and error of root mean square in both practical implementation and simulation 114
4.19 Results that show the relationship between the actual number of objects and error of root mean square in both RTAB-Map and RVIZ 117
4.20 Number of features that can be extracted from the groundtruth image and image of object detected 119
4.21 Position of four corners of object 10, 12, 13 and 18 detected in the image in terms of image pixels 122
4.22 Relationship between height and weight of groundtruth and object detected 124
4.23 Relationship and setup of ROS across host machine and slave machine 126
4.24 Relationship between default and calibrated intrinsic parameters for USB camera 127
4.25 Single pose of camera and its format 131

LIST OF FIGURES

FIGURE  TITLE  PAGE

2.1 Dead Reckoning of the Robot (Zhenjun, Nisar and Malik, 2014) 6
2.2 An overview of SLAM process integrated with Extended Kalman Filter, EKF (Riisgaard and Blas, 2004) 9
2.3 The triangle is a representation of robot. The stars are representation of landmarks. The lightning are representation of location of landmarks based on measurement of sensors (Riisgaard and Blas, 2004) 9
2.4 The robot estimates its current position and odometry provides distance travelled by robot (Riisgaard and Blas, 2004) 10
2.5 Sensors are used to measure the location of landmark relative to position of robot but it does not match with the location provided odometry data. Thus,

