Cellular Communication-Based Autonomous UAV Navigation with Obstacle Avoidance for Unknown Indoor Environments


Received: October 23, 2020. Revised: January 14, 2021.

Yeon Ji Choi¹, I Nyoman Apraz Ramatryana¹, Soo Young Shin¹*

¹Department of IT Convergence Engineering, Kumoh National Institute of Technology, 61 Daehak-ro, Gumi, Gyeongbuk, 39177, South Korea
*Corresponding author's Email: wdragon@kumoh.ac.kr

International Journal of Intelligent Engineering and Systems, Vol. 14, No. 2, 2021. DOI: 10.22266/ijies2021.0430.31

Abstract: The technology of unmanned aerial vehicles (UAVs) has been the subject of pioneering research in recent years and has been applied in a variety of outdoor and indoor situations. In this paper, a cellular communication-based autonomous UAV navigation scheme is proposed that enables UAVs equipped with LiDAR sensors to maneuver independently in previously unknown, GPS-denied indoor environments. The main idea of the proposed scheme is to use an LTE connection for UAV navigation and to compare it with a Wi-Fi connection. The proposed autonomous navigation system for unknown environments such as indoor areas relies on the combination of a ROS-based Hector SLAM system, a 2D LiDAR sensor, and an LTE connection. The performance of the proposed scheme is evaluated in terms of position error and exploration time for an unknown indoor area. The mapping coverage of the LTE connection is 57.5% greater than that of the Wi-Fi connection, and the exploration time of a flight is approximately 200 seconds.

Keywords: Autonomous UAV, Navigation, SLAM, Obstacle avoidance, LiDAR sensor, LTE connection, Unknown indoor environment.

1. Introduction

In many fields such as air defense, precision agriculture, the military, smart transport, and search and rescue operations, unmanned aerial vehicles (UAVs) have become substantially established in the scientific world. UAVs have been used in various outdoor and indoor environments, including reconnaissance, surveillance, saving lives in disaster situations, indoor positioning, and logistics warehouses. There has been considerable progress in the autonomous navigation of these systems. The global positioning system (GPS) is used for outdoor autonomous navigation [1]. Because GPS signals are typically absent or weak indoors, autonomous indoor navigation is difficult [2].

Various approaches for independent indoor navigation have been proposed in recent years. Most of them use laser range detectors (LiDAR) [3], RGB-D sensors [4], or stereo vision [5] to create a 3D map of an unseen field, which helps to locate the unit at all times. Simultaneous localization and mapping (SLAM) technology is used by the unit to construct an indoor map in real time and simultaneously locate its own position on the map. A point cloud map of the indoor environment is used for SLAM. The cloud map can be applied to several fields, such as surveying and mapping, which significantly improves the effectiveness of building indoor maps.

However, SLAM is computationally intensive and thus often fails in real-time scenarios, in which a UAV must acquire a map and compute its position simultaneously. Various positioning and mapping algorithms have been used for theoretical analysis and practical application in robotics [6], and solving these practical problems remains challenging. Selecting an algorithm involves comparing the relevant methods and choosing the appropriate one.
The most significant problem is that the SLAM algorithm is typically difficult to set up in a private setting. To address this problem, a modular architecture for running robot applications has been developed: the robot operating system (ROS). The concept of the ROS is to create software that can be reused on many robotic systems by modifying only a small amount of code. The ROS also helps researchers simulate easily and perform real experiments effectively. A ROS application programming interface is provided by most SLAM libraries. The most commonly used LiDAR SLAM libraries are Gmapping [7], Google Cartographer [8], and Hector SLAM [9].

Previous reports [10, 11] include trajectory comparisons, in which LiDAR-based SLAM trajectories are measured against data obtained using various visual sensors (stereo and monocular cameras) and depth sensors. An analysis [12] was performed on five laser 2D SLAM techniques usable in ROS (Hector SLAM, Gmapping, KartoSLAM, CoreSLAM, and LagoSLAM) for data obtained in real-life testing, using maps created with measurements based on local neighborhoods. In another study, three 2D SLAM methods provided by the ROS were compared: Gmapping, Hector SLAM, and CRSM SLAM [13]. These studies motivated us to support and improve the methodical study of the 2D SLAM approaches Gmapping, Hector SLAM, and Google Cartographer with the average nearest neighbor distance (ANND) metric of [12]. Accordingly, this paper proposes a cellular-communication-based connection for an autonomous UAV with a LiDAR sensor and Hector SLAM. Long term evolution (LTE) was chosen as the cellular-based connection.

The rest of the paper is organized as follows. Section 2 details the related work, while Section 3 explains the system model and implementation. Section 4 presents the experimental results and discussion. Section 5 concludes the work and outlines future work.

2. Related works

In previous studies on autonomous UAV navigation, a 3D map of the local area is constructed. These techniques are used in certain situations to map precise trajectories for quadcopters [14, 15]. However, they are based on a sophisticated control regime, and thus their use is limited to laboratory settings [16, 17, 18]. In other methods, the map is learned from a manually flown path, and quadcopters subsequently fly along the same path [19]. GPS-based pose estimates are used for most outdoor flights but are not as accurate in indoor scenarios.

Most approaches employ small sensors, such as infrared, RGB-D, or laser spectrum sensors [4]. In one autonomous navigation system, a single ultrasonic sensor is used together with an infrared sensor [20]. A LiDAR and inertial measurement unit (IMU) state-estimation approach shows stable operation in unknown, GPS-denied environments [21]. The problem with range sensors is that they are heavy and their power consumption is high, and thus they are not ideal for most UAVs.

The SLAM technique uses various optical sensors to generate a 3D map [16-18] together with the location of the UAV at any point on the map [22]. A 3D map of an uncertain indoor scenario is built for SLAM using a laser rangefinder [23]. Indoor navigation with a single camera has also been achieved using SLAM [24, 25]. However, the major downside of SLAM is that the regeneration of the 3D map is complex, requiring considerably high-precision measurement and resource usage, as additional sensors are necessary.

In real-time navigation, SLAM can also incur communication delays and produce small maps [26, 27].
Moreover, SLAM is primarily feature-based, and its performance is poor on indoor surfaces such as walls and ceilings because these surfaces offer few distinguishing features. In a hallway consisting of walls, ceilings, and floors, SLAM technology cannot achieve a desirable navigational efficiency.

3. System model and implementation

Navigation is a technique for controlling and deciding moving paths and directions from an environment map for localization. There are three types of navigation: map building-based, map-based, and map-less. In the present study, the map building-based method is used because the environment is unknown. It allows UAVs to create their own maps from sensing information through SLAM. The occupancy map divides the space into a grid of a certain cell size; the free space available for flying is distinguished from the space occupied by obstacles or structures.
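As a concrete illustration of this occupancy representation, consider the following minimal Python sketch. It is not the authors' implementation; the 5 cm cell size and the 10 m x 10 m map extent are assumed values.

```python
# Minimal occupancy-grid sketch (illustrative assumption, not the authors' code):
# space is divided into fixed-size cells, 0 = free, 1 = occupied by an obstacle.

RESOLUTION = 0.05  # assumed cell size in meters (5 cm)

def world_to_cell(x, y):
    # Map a world coordinate (meters) to a grid cell index.
    return int(x // RESOLUTION), int(y // RESOLUTION)

def mark_occupied(grid, x, y):
    # Mark the cell containing world point (x, y) as occupied.
    r, c = world_to_cell(x, y)
    if 0 <= r < len(grid) and 0 <= c < len(grid[0]):
        grid[r][c] = 1

grid = [[0] * 200 for _ in range(200)]  # assumed 10 m x 10 m map at 5 cm resolution
mark_occupied(grid, 3.5, 1.2)           # e.g., a LiDAR return at (3.5 m, 1.2 m)
```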

This section presents the implementation setup and environment. An indoor environment is selected as the test location because it is the most realistic and representative situation. The system model of the proposed scheme is shown in Fig. 1.

Figure 1. System model

3.1 Hector SLAM algorithm

Hector SLAM¹ is a 2D SLAM system [9] that incorporates robust LiDAR scan matching and a 3D navigation solution, including an extended Kalman filter (EKF) inertial sensing system. Hector SLAM is designed for onboard real-time computation: the six-degree-of-freedom (6DOF) robot maintains a continuously updated LiDAR-based 2D map in real time while in motion. A Gauss-Newton optimization method aligns the endpoints of the laser beams with the obtained map, into which all prior scans are indirectly fitted.

The "move_base" package is used in this study. The "move_base" node is an important element of the ROS navigation stack; it enables the UAV to move from its current position to a target position. After path planning is complete, the user can employ the "2D Nav Goal" tool in RViz to specify the goal position (x, y) and orientation, which is then published on the "move_base/goal" topic. "move_base" provides a global planner and a local planner. The global planner calculates a safe route for the UAV to reach the target location; its direction is determined as the UAV starts flying and scans while moving. Once the global planner has prepared the journey, the map is handed to the local planner. Taking all of the laser readings into account, the local planner splits the route into segments, thereby providing the UAV with velocity commands to follow the local direction.

In this study, the A* algorithm was used as the global planner, and the "base_local_planner" trajectory rollout algorithm was used to calculate the path as the local planner.
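As an illustration of how a goal is handed to the navigation stack, the following minimal Python sketch publishes a geometry_msgs/PoseStamped on the /move_base_simple/goal topic, the same topic used by the RViz "2D Nav Goal" tool; move_base then wraps it as an action goal. This is not the authors' code, and the node name and example coordinates are assumptions.

```python
#!/usr/bin/env python
# Minimal sketch: send a navigation goal to move_base, as the RViz
# "2D Nav Goal" tool does. Assumes a running move_base node and a "map" frame.
import rospy
from geometry_msgs.msg import PoseStamped

def publish_goal(x, y):
    pub = rospy.Publisher('/move_base_simple/goal', PoseStamped, queue_size=1)
    rospy.sleep(1.0)  # give the publisher time to connect to subscribers
    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = x
    goal.pose.position.y = y
    goal.pose.orientation.w = 1.0  # identity orientation quaternion
    pub.publish(goal)

if __name__ == '__main__':
    rospy.init_node('goal_publisher')
    publish_goal(3.1, -0.5)  # e.g., goal point (a) from Section 4
```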
The A* algorithm can be applied to a grid-type map, in which the system examines the rectangles surrounding the starting point. Table 1 and Fig. 2 show the process of the A* algorithm. If there are no obstacles, there are eight surrounding points, shown as rectangles. The system inputs the starting point and its surrounding rectangles into the Openset; at this stage, the starting point (center point) is the "Parent" of the other Openset entries. Then, the starting point is removed from the Openset and placed into "Visited" so that it does not need to be checked again. Next, the "Child" with the smallest f value is selected, and the foregoing process is repeated, where

f = g + h,    (1)

Here, g represents the distance from the starting point to the current point, and h represents the distance from the current point to the goal point.

The distance from the starting point to the surrounding Openset entries is either 10 or 14: the top, bottom, left, and right neighbors cost 10, and the diagonal neighbors cost 14. The value of h is measured only along non-diagonal directions, as the distance between the current point and the goal without considering obstacles. Finally, f is the sum of g and h. The rectangle with the smallest f among the Openset entries is selected, removed from the Openset, and placed into "Visited"; its neighbors (except for obstacles) are placed into the Openset, with the selected rectangle recorded as their "Parent".

When the next rectangle is selected and its surrounding rectangles are input into the Openset, the system checks whether any rectangle already in the Openset can now be reached with a smaller g value. If so, that entry is updated to the path with the smaller g value, and the processed rectangle is placed into "Visited". Furthermore, the remainder of the neighborhood is input into the Openset. If the destination enters the Openset during this repetition, the process terminates; the UAV can then follow the path back from the destination by referring to "Visited".

Figure 2. A* algorithm

Table 1. Process of the A* algorithm

            1st step                            2nd step
Openset     center, surrounding 8 rectangles    1st surrounding 8 rectangles, 2nd surrounding 8 rectangles
Parent      center                              center
Visited     center                              1st center, 2nd center

¹ Hector SLAM: http://wiki.ros.org/hector_slam
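The following minimal Python sketch (an illustration, not the authors' planner) implements the search process of Table 1 and Eq. (1) on a grid, with the stated costs of 10 for straight moves and 14 for diagonal moves. The grid encoding (0 = free, 1 = obstacle) is an assumption, and the heuristic follows the text in measuring h only along non-diagonal directions, so it may overestimate across diagonals.

```python
# Minimal A* sketch following the Openset/Visited process described above:
# f = g + h, straight moves cost 10, diagonal moves cost 14.
import heapq

MOVES = [(-1, 0, 10), (1, 0, 10), (0, -1, 10), (0, 1, 10),
         (-1, -1, 14), (-1, 1, 14), (1, -1, 14), (1, 1, 14)]

def heuristic(p, goal):
    # h measured only along non-diagonal directions, ignoring obstacles
    return 10 * (abs(p[0] - goal[0]) + abs(p[1] - goal[1]))

def a_star(grid, start, goal):
    open_set = [(heuristic(start, goal), 0, start)]   # (f, g, cell) = Openset
    parent, best_g, visited = {start: None}, {start: 0}, set()
    while open_set:
        f, g, cur = heapq.heappop(open_set)           # smallest f first
        if cur == goal:                               # goal entered: backtrack
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        if cur in visited:
            continue
        visited.add(cur)                              # move cell to "Visited"
        for dr, dc, cost in MOVES:
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])):
                continue
            if grid[nxt[0]][nxt[1]] == 1:             # obstacles never expanded
                continue
            ng = g + cost
            if ng < best_g.get(nxt, float('inf')):    # smaller g found: re-parent
                best_g[nxt] = ng
                parent[nxt] = cur
                heapq.heappush(open_set, (ng + heuristic(nxt, goal), ng, nxt))
    return None
```

For example, a_star([[0] * 5 for _ in range(5)], (0, 0), (4, 4)) returns a list of grid cells from start to goal, or None when the goal is unreachable.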

3.2 System setup

The proposed strategy was validated via real-time experiments using the Parrot Bebop 2 drone. The Bebop was selected because of its small size: it can maneuver in dense environments while carrying the sensor load. The equipment required for the experiment is listed in Table 2. The Jetson TX2 onboard PC provides enough computational power to process the 2D LiDAR sensor data; the UAV equipped with this PC is shown in Fig. 3(c). Finally, the laptop listed in Table 2 is used as the ground control station (GCS).

The network architecture of the system is presented in Fig. 3(a). The UAV carries the LTE modem, the Jetson TX2, and the LiDAR sensor. The communication between the GCS and the UAV is wireless, either Wi-Fi based or LTE based. All algorithms were implemented in Python, ROS Kinetic was used as middleware, and the latest version of the ROS bebop_autonomy driver was adopted.

Figure 3. (a) Network architecture, (b) reference axis and rotation direction for UAV maneuvers, and (c) hardware configuration

Table 2. Equipment used

Device          Model name       Company
LiDAR sensor    RPLiDAR S1       Slamtec
UAV             Bebop drone 2    Parrot
PC on-board     Jetson TX2       Nvidia
Carrier board   Auvidea J120     Auvidea
GCS             ThinkPad T580    Lenovo
LTE modem       LTE USB Stick    Huawei

3.3 LTE connection

In the proposed system, the UAV supports an LTE connection through a softmod. First, the LTE modem is attached to the Parrot Bebop 2, as shown in Fig. 3(c). Then, the drone is configured to switch from the Wi-Fi connection to the LTE connection. With only the Wi-Fi connection (without LTE), the UAV can fly a maximum distance of 10 m. LTE was used because, if the connection between the UAV and the laptop is lost, the system stops working, and LTE provides greater coverage than Wi-Fi. A Wi-Fi handover scheme for autonomous UAVs was not considered in this study. Additionally, LTE allows multiple UAVs to connect to the GCS; the connection between the GCS and multiple UAVs is based on a ZeroTier server. Fig. 4(a) illustrates the LTE connection between the UAV and the GCS, and Fig. 4(b) presents the connection between the GCS and multiple UAVs.

Figure 4. (a) Parrot Bebop 2 over 4G/LTE (softmod) and (b) ZeroTier network for multiple UAVs

3.4 LiDAR scanning system

The LiDAR sensor can cover a wider area than a camera and is unaffected by light and weather, which makes the UAV applicable in both indoor and outdoor environments. The LiDAR sensor used in this study has a measurement range of 360°. Each raw laser point is represented in the polar coordinate system as {(d_i, θ_i); 0 ≤ i ≤ 359}, where d_i represents the distance from the center of the UAV to the object and θ_i represents the relative angle of the measurement. The data received by the LiDAR sensor are stored as vectors (d_i, θ_i) and checked for infinity scan values. An infinity scan value indicates that there is no obstacle along the ray up to the maximum range that the LiDAR can measure. Additionally, any object located at the maximum range value (d_max) is neglected, since in a real-time situation d_max does not take an infinite value when an object lies outside the operating range. The maximum value of d_i (d_max) is 10 m for the RPLiDAR S1 sensor, as depicted in Fig. 5.

Figure 5. RPLiDAR S1 scanning system
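A minimal sketch of this range filtering in ROS is shown below. The /scan topic name is an assumption, and Hector SLAM itself consumes the sensor_msgs/LaserScan messages directly; this only illustrates the d_max check described above and the conversion of the polar pairs (d_i, θ_i) to Cartesian points.

```python
# Minimal sketch: read the 2D LiDAR scan and discard "infinity" or
# out-of-range readings before treating returns as obstacle points.
import math
import rospy
from sensor_msgs.msg import LaserScan

D_MAX = 10.0  # maximum usable range of the RPLiDAR S1 in this setup (m)

def scan_cb(scan):
    obstacles = []
    for i, d in enumerate(scan.ranges):
        if math.isinf(d) or math.isnan(d) or d >= D_MAX:
            continue  # no obstacle along this ray within the usable range
        theta = scan.angle_min + i * scan.angle_increment
        obstacles.append((d * math.cos(theta), d * math.sin(theta)))  # (x, y)
    rospy.loginfo('%d obstacle points in range', len(obstacles))

rospy.init_node('scan_filter')
rospy.Subscriber('/scan', LaserScan, scan_cb)
rospy.spin()
```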

3.5 Indoor environment

For the indoor environment, we selected a room and a corridor, shown in Fig. 6. The area is arranged for operating UAV navigation path planning.

Figure 6. Study room and corridor at Kumoh National Institute of Technology

3.6 Obstacle avoidance

If path planning is impossible because of temporary errors, the LiDAR sensor continues to attempt obstacle recognition. If an obstacle is recognized, evasive maneuvers are performed through obstacle avoidance control. The procedure is given as pseudocode in Algorithm 1.

Algorithm 1: SLAM for obstacle avoidance

    Result: Avoid obstacle with least space consumption
    initialization;
    generate map and do localization;
    set goal point;
    determine obstacle distance value;
    while map navigation is generated do
        check path planning;
        if path planning from SLAM is available then
            follow generated path;
        else
            navigation system error;
            update obstacle position and distance;
            for obstacle distance ≤ r meters do
                check obstacle angle;
                if 0° ≤ obstacle angle ≤ θ° then
                    rotate left;
                else
                    rotate right;
                end
            end
        end
    end
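A minimal Python sketch of the rotate-left/rotate-right rule of Algorithm 1 is given below; it is not the authors' implementation. The threshold values R_THRESHOLD (r) and THETA_SPLIT (θ) are assumed, and velocity commands are published as geometry_msgs/Twist messages on the cmd_vel topic exposed by the bebop_autonomy driver.

```python
# Minimal sketch of the avoidance decision in Algorithm 1 (illustrative only).
import rospy
from geometry_msgs.msg import Twist

R_THRESHOLD = 1.0    # assumed obstacle-distance threshold r (m)
THETA_SPLIT = 180.0  # assumed angle threshold θ separating left/right (deg)

def avoid(obstacle_distance, obstacle_angle_deg, pub):
    cmd = Twist()
    if obstacle_distance <= R_THRESHOLD:
        if 0.0 <= obstacle_angle_deg <= THETA_SPLIT:
            cmd.angular.z = 0.5   # rotate left (positive yaw, see Fig. 3(b))
        else:
            cmd.angular.z = -0.5  # rotate right
    pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('obstacle_avoidance')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
    avoid(0.8, 45.0, pub)  # example: obstacle at 0.8 m, 45 deg -> rotate left
```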

4. Implementation results

Obstacle detection in SLAM must be based on an accurate map, because inaccurate recognition of the occupied space can cause inefficient evasive trajectory planning. Accordingly, a fast and accurate detection system is required. The flight results of the UAV during a system error are shown in Fig. 7(a), indicating the planned path that appears after the target point is selected. However, the UAV may deviate from the planned route because of a system error, as shown in Fig. 7(b).

Figure 7. (a) Path planning and (b) UAV movement during a navigation system error

Fig. 8(a) shows the experimental environment for verification of the obstacle avoidance algorithm, and Fig. 8(b) presents the positions of obstacles derived from the LiDAR scan information. The movement of the UAV for obstacle avoidance using Algorithm 1 is shown in Fig. 8(c); the UAV rotated about the z-axis of Fig. 3(b).

Figure 8. (a) The obstacle for the avoidance experiment, (b) obstacle position, and (c) UAV movement for obstacle avoidance

The UAV is equipped with a LiDAR sensor producing a 2D point cloud as a metric representation of the environment. A map is updated from the point cloud in real time while the UAV explores the area (see Fig. 9(a)). While flying in the environment, the UAV can determine its position in a global map frame, as shown in Fig. 9(b).

Figure 9. Left to right and top to bottom: (a) environment mapping, (b) localization on the map, (c) path planning on the map updated in real time, and (d) path planning on the previously updated map

Figure 10. Graphical representation of the real-time map produced by sensor data: (a) goal point a and (b) goal points b and c

Figure 11. (a) UAV pose and (b) navigation goal

Received: October 23, 2020.Revised: January 14, 2021.frame, as shown in Fig. 9 (b). Path planning isperformed using SLAM to navigate from its currentlocation to the goal point after crossing an obstacle,as depicted in Fig. 9 (c). As shown in Fig. 9 (d), pathplanning is performed using a previously generatedmap.The results have been demonstrated thatnavigation mapping and time connected with Wi-Fiand LTE. In Fig. 10, it is presented that a real-timemap produced by sensor data on the current positionusing Wi-Fi, which flies through a corridor (a) at asecond point (b) and out of the corridor (c) to reachthe next point. It is indicated that the axis, blue lineand green line are associated with the current location,the flight path of UAV and planned path generated atthe designating destination.These points (a), (b) and (c) are also mentioned inFig. 11. The real-time position of the x-axis and yaxis of the UAV are shown in Fig. 11 (a) and thenavigation goal point which is detected by SLAM atthe time is demonstrated in Fig. 11 (b). In thedemonstration, the UAV pause on (a) position at 70seconds because of system booting time. Whennavigation goal point (a) is expressed at (3.1,-0.5),real position is detected at (3.5, -0.5). At the next goalpoint (b) is designated at (5.8, -1.2), real position isdetected at (6, -1.5). The last goal point is at (3.8, 3.4),UAV is detected at (3.9, 3.6) taking 115 secondsthough whole route. An average error rate of positionis around (0.23, 0.16).(a)350In this paper, we have proposed UAV navigationusing LTE rather than Wi-Fi because of mappingrange. An improved mapping is obtained using theproposed which is presented in Fig. 12 (b) on theexperimental environment. The mapping capabilityof LTE based is 57.5% greater than the Wi-Fi basedas shown in Fig. 12 (a). This experiment issummarized in Table 3 on execute time of taking-offand landing. We assume that this map is an unknownarea, exploration time is represented as 196 seconds.The exploration time of the UAV based Wi-Fiwas 60 seconds, but a mapping from GCS using WiFi is smaller than using LTE. So it is unnecessary tocheck and compare exploration time between Wi-Fibased and LTE based.5. ConclusionIn this paper, we have proposed that a cellularcommunication-based UAV navigation with obstacleavoidance for unknown indoor environment. Theperformance of the proposed system is demonstratedin indoor scenarios, considering the UAV movement.It includes obstacle avoidance, UAV pose andnavigation goal perspective. In the system, LTE isused for achieving cellular-communication-basedconnection. Using LTE, wide range of mapping isobtained by UAV on the indoor environment. Whencellular communication is used as connectionbetween UAV and GCS, the system has stableconnectivity. In the experiment, LTE based increases57.5% mapping coverage compared with Wi-Fibased. Despite Wi-Fi handover for autonomous UAVis not considered in this paper, LTE based hasbenefits such as multiple connection between UAVand GCS and connectivity stability.Conflicts of InterestThe authors declare no conflict of interest.Author Contributions(b)Figure. 32 Result of map using hector SLAM: (a) Wi-Fibased and (b) LTE basedTable 3. 
In this paper, we have proposed UAV navigation using LTE rather than Wi-Fi because of the mapping range. The improved map obtained with the proposed LTE-based scheme in the experimental environment is presented in Fig. 12(b); its mapping coverage is 57.5% greater than that of the Wi-Fi-based scheme shown in Fig. 12(a). The experiment is summarized in Table 3 in terms of the execution time of take-off and landing. Treating this map as an unknown area, the exploration time is 196 seconds. The exploration time of the Wi-Fi-based UAV was 60 seconds, but the map built at the GCS using Wi-Fi is smaller than that built using LTE, so a direct comparison of the exploration times of the Wi-Fi-based and LTE-based schemes is not meaningful.

Figure 12. Result of mapping using Hector SLAM: (a) Wi-Fi based and (b) LTE based

Table 3. Time of the SLAM navigation using LTE (take-off: 18.95 s)

5. Conclusion

In this paper, we have proposed a cellular-communication-based UAV navigation system with obstacle avoidance for unknown indoor environments. The performance of the proposed system is demonstrated in indoor scenarios, considering the UAV movement, including obstacle avoidance, the UAV pose, and the navigation goal perspective. In the system, LTE is used as the cellular-communication-based connection. Using LTE, the UAV obtains a wide mapping range in the indoor environment, and the connection between the UAV and the GCS is stable. In the experiment, the LTE-based scheme increases mapping coverage by 57.5% compared with the Wi-Fi-based scheme. Although Wi-Fi handover for autonomous UAVs is not considered in this paper, the LTE-based scheme has additional benefits, such as multiple connections between UAVs and the GCS and connection stability.

Conflicts of Interest

The authors declare no conflict of interest.

Author Contributions

"Conceptualization, Yeon Ji Choi and I Nyoman Apraz Ramatryana; methodology, Yeon Ji Choi; software, Yeon Ji Choi; validation, Yeon Ji Choi; writing-original draft preparation, Yeon Ji Choi; writing-review and editing, I Nyoman Apraz Ramatryana; supervision, Soo Young Shin".

Acknowledgments

"This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the Grand Information Technology Research Center support program, supervised by the IITP (Institute for Information & communications Technology Planning & Evaluation)".

References

[1] J. Kwak and Y. Sung, "Autonomous UAV Flight Control for GPS-Based Navigation", IEEE Access, Vol. 6, pp. 37947-37955, 2018.
[2] Y. Li and C. Shi, "Localization and Navigation for Indoor Mobile Robot Based on ROS", In: Proc. of 2018 Chinese Automation Congress (CAC), Xi'an, China, pp. 1135-1139, 2018.
[3] D. Chen and G. X. Gao, "Probabilistic graphical fusion of LiDAR, GPS, and 3D building maps for urban UAV navigation", Navigation, Vol. 66, No. 1, pp. 151-168, 2019.
[4] B. Wu, X. Ge, L. Xie, and W. Chen, "Enhanced 3D Mapping with an RGB-D Sensor via Integration of Depth Measurements and Image Sequences", Photogrammetric Engineering & Remote Sensing, Vol. 85, No. 9, pp. 633-642, 2019.
[5] X. Lin, J. Wang, and C. Lin, "Research on 3D Reconstruction in Binocular Stereo Vision Based on Feature Point Matching Method", In: Proc. of 2020 IEEE 3rd International Conf. on Information Systems and Computer Aided Education (ICISCAE), Dalian, China, pp. 551-556, 2020.
[6] A. A. Panchpor, S. Shue, and J. M. Conrad, "A survey of methods for mobile robot localization and mapping in dynamic indoor environments", In: Proc. of 2018 Conf. on Signal Processing and Communication Engineering Systems (SPACES), Vijayawada, India, pp. 138-144, 2018.
[7] H. Wang, M. Huang, and D. Wu, "A Quantitative Analysis on Gmapping Algorithm Parameters Based on Lidar in Small Area Environment", In: Proc. of Chinese Intelligent Systems Conf., Springer, Singapore, pp. 480-492, 2019.
[8] W. Hess, D. Kohler, H. Rapp, and D. Andor, "Real-time loop closure in 2D LIDAR SLAM", In: Proc. of 2016 IEEE International Conf. on Robotics and Automation (ICRA), Stockholm, Sweden, pp. 1271-1278, 2016.
[9] P. Vanicek and L. Beran, "Navigation of robotics platform in unknown spaces using LIDAR, Raspberry PI and hector slam", Journal of Fundamental and Applied Sciences, Vol. 10, No. 3S, pp. 494-506, 2018.
[10] I. Z. Ibragimov and I. M. Afanasyev, "Comparison of ROS-based visual SLAM methods in homogeneous indoor environment", In: Proc. of 2017 14th Workshop on Positioning, Navigation and Communications (WPNC), Bremen, Germany, pp. 1-6, 2017.
[11] M. Filipenko and I. Afanasyev, "Comparison of various SLAM systems for mobile robot in an indoor environment", In: Proc. of 2018 International Conf. on Intelligent Systems (IS), Funchal, Madeira Island, pp. 400-407, 2018.
[12] J. M. Santos, D. Portugal, and R. P. Rocha, "An evaluation of 2D SLAM techniques available in robot operating system", In: Proc. of 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Linkoping, Sweden, pp. 1-6, 2013.
[13] M. Rojas-Fernandez, D. Mujica-Vargas, M. Matuz-Cruz, and D. Lopez-Borreguero, "Performance comparison of 2D SLAM techniques available in ROS using a differential drive robot", In: Proc. of 2018 International Conf. on Electronics, Communications and Computers (CONIELECOMP), Cholula Puebla, Mexico, pp. 50-58, 2018.
[14] D. Mellinger and V. Kumar, "Minimum snap trajectory generation and control for quadrotors", In: Proc. of 2011 IEEE International Conf. on Robotics and Automation, Shanghai, China, pp. 2520-2525, 2011.
[15] D. Mellinger, N. Michael, and V. Kumar, "Trajectory generation and control for precise aggressive maneuvers with quadrotors", The International Journal of Robotics Research, Vol. 31, No. 5, pp. 664-674, 2012.
[16] P. Checchin, F. Gerossier, C. Blanc, R. Chapuis, and L. Trassoudaine, "Radar scan matching SLAM using the Fourier-Mellin transform", Field and Service Robotics, Vol. 62, pp. 151-161, 2010.
[17] J. Engel, T. Schops, and D. Cremers, "LSD-SLAM: Large-scale direct monocular SLAM", In: Proc. of European Conf. on Computer Vision, Springer, Zurich, Switzerland, pp. 834-849, 2014.
[18] G. Sibley, C. Mei, P. Newman, and I. Reid, "A system for large-scale mapping in constant-time using stereo", International Journal of Robotics Research, pp. 834-849, 2010.
[19] M. Muller, S. Lupashin, and R. D'Andrea, "Quadrocopter ball juggling", In: Proc. of 2011 IEEE/RSJ International Conf. on Intelligent Robots and Systems, San Francisco, California, USA, pp. 5113-5120, 2011.
[20] T. C. Wang, C. S. Tong, and B. L. Xu, "AGV navigation analysis based on multi-sensor data fusion", Multimedia Tools and Applications, Vol. 79, No. 7, pp. 5109-5124, 2020.

[21] T. Pozderac, J. Velagić, and D. Osmanković, "3D Mapping Based on Fusion of 2D Laser and IMU Data Acquired by Unmanned Aerial Vehicle", In: Proc. of 2019 6th International Conf. on Control, Decision and Information Technologies (CoDIT), Paris, France, pp. 1533-1538, 2019.
[22] F. Huang, H. Yang, X. Tan, S. Peng, J. Tao, and S. Peng, "Fast Reconstruction of 3D Point Cloud Model Using Visual SLAM on Embedded UAV Development Platform", Remote Sensing, Vol. 12, No. 20, p. 3308, 2020.
[23] A. Bachrach, R. He, and N. Roy, "Autonomous flight in unknown indoor environments", International Journal of Micro Air Vehicles, Vol. 1, No. 4, pp. 217-228, 2009.
[24] M. Achtelik, M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments", In: Proc. of 2011 IEEE International Conf. on Robotics and Automation, Shanghai, China, pp. 3056-3063, 2011.
[25] M. Blosch, S. Weiss, D. Scaramuzza, and R. Siegwart, "Vision based MAV navigation in unknown and unstructured environments", In: Proc. of 2010 IEEE International Conf. on Robotics and Automation, Anchorage, Alaska, USA, pp. 21-28, 2010.
[26] F. Liu, J. Liu, Y. Yin, W. Wang, D. Hu, P. Chen, and Q. Niu, "Survey on WiFi-based indoor positioning techniques", IET Communications, Vol. 14, No. 9, pp. 1372-1383, 2020.
[27] J. Kwak and Y. Sung, "Beacon-Based Indoor Location Measurement Method to Enhanced Common Chord-Based Trilateration", Journal of Information Processing Systems, Vol. 13, No. 6, pp. 1640-1651, 2017.

