Obstacle Detection And Avoidance For Mobile Robots


Christopher A. Ryther
Ole B. Madsen

Obstacle Detection and Avoidance for Mobile Robots

Bachelor's Thesis, June 2009

Obstacle Detection and Avoidance for Mobile Robots

Report written by: Christopher A. Ryther, Ole B. Madsen
Advisors: Nils Axel Andersen (naa@elektro.dtu.dk), Ole Ravn (or@elektro.dtu.dk)
DTU Elektro, Technical University of Denmark, DK-2800 Kgs. Lyngby, Denmark
E-mail: studieadministration@elektro.dtu.dk
Project period: February - June, 2009
ECTS: 15
Education: Bachelor of Science in Engineering (BSc)
Field: Electro technology
Class: 1 (public)
Edition: 1st edition
Remarks: This report is submitted as partial fulfillment of the requirements for graduation in the above education at the Technical University of Denmark.
Copyright: © C. Ryther & O. Madsen, 2009

Abstract

This report details the work performed over five months for a bachelor's thesis at the Technical University of Denmark (DTU). The project presents an obstacle avoidance plug-in module for a laser scanner application. A common wavefront algorithm is used for path planning, and a fixed cell size grid map is used for the internal representation of the environment. The fundamental parts of the obstacle avoidance functionality are first tested in simulated environments, and ultimately the entire system is tested on a differential-drive robot mounted with a laser scanner, in three different scenarios. The report concludes that, under the given conditions, the plug-in effectively completes the proposed test missions while avoiding obstacles.

Summary

Mobile robots are complex systems comprised of numerous sub-technologies, many of which still have considerable potential for improvement. Navigation and mission planning, which are among the most critical aspects of a mobile robot, are the primary subjects of this thesis.

Using a small differential-drive robot mounted with a laser scanner, this project seeks to implement an obstacle avoidance system and validate its effectiveness through different test scenarios. For this purpose, the two aspects of greatest importance are the internal representation of the surroundings and the search algorithm with which to plan a collision-free route.

The data received from the laser scanner is transformed and mapped into an internal fixed cell size grid map. From this map a modified wavefront algorithm is used to plan a safe path for the robot. Other functionality, such as relocating the goal when it is unreachable and resetting the map for long missions, is implemented to increase the number of scenarios the plug-in can handle.

The obstacle avoidance system is tested on different platforms throughout the development. It is first partially simulated and tested in MatLab and then in C. Finally, the whole program is implemented as a plug-in for the laser scanner in C++.

The entire system is tested as a plug-in on the mobile robot in three scenarios, including a custom one-way maze and a long obstacle-filled hallway. Missions of up to 10 meters have been completed in this way.

In conclusion, the fundamental parts of the plug-in work as intended under ideal conditions and can navigate a mobile robot through many types of environments. However, because the plug-in is sensitive to robot odometry, and because the world can present many more scenarios than were tested in the project, there is much potential for future work.

Contents

1 Introduction
  1.1 Thesis Statement
  1.2 Work by Others
2 The SMR Platform
  2.1 The Small Mobile Robot
  2.2 Laser Scanner
3 Map Building
  3.1 Map Representations
  3.2 Fixed Cell Size Grid Map
  3.3 Bresenham Line Algorithm
  3.4 Map Dilation
  3.5 Map Reset
4 Path Planning
  4.1 Types of Path Planning
  4.2 Goal Location
  4.3 The Wavefront Planner
  4.4 Planning The Route
  4.5 Line Finder
5 Program Structure
  5.1 Program Structure
  5.2 SMRCL Drive
6 Results
  6.1 Simulations
    6.1.1 Testing the Grid Map and Dilated Map
    6.1.2 Testing the Wavefront Planner
  6.2 Practical Tests
    6.2.1 Laser Scan to Grid Map
    6.2.2 Plan Path from Dilated Map
    6.2.3 Scenario: Simple Avoidance with a Few Boxes
    6.2.4 Scenario: Maze
    6.2.5 Scenario: Hallway
    6.2.6 Repeated Runs
7 Conclusion
  7.1 Future Work
    7.1.1 Implementation in Autonomous Tractor
    7.1.2 Gradient Map
    7.1.3 Odometry Correction Using Pattern Recognition
    7.1.4 Obstacle Tracking
Bibliography
Appendix
A Command Reference for SMR-CL
B User Manual
C Example SMR-CL script
D Results from repeated runs

List of Figures

2.1 Laser scanner example
3.1 Variable cell size map
3.2 Fixed cell size representation of figure 3.1
3.3 Bresenham's line algorithm
3.4 Scattered measurement points averaged by dilation
4.1 Clearing the area around the robot
4.2 Drawing a circle around the robot
5.1 Program architecture for control of the SMR
5.2 Temporal diagram
5.3 Flowchart diagram of the plug-in
6.1 Simulated map dilation and wavefront expansion
6.2 Testing the grid map with real data
6.3 Testing path planning with real data
6.4 Photograph of simple setup
6.5 Grid map from the simple setup
6.6 Rotated grid map from the simple setup
6.7 Photograph of maze
6.8 Grid map from the maze
6.9 Photograph of the hallway
6.10 Combined grid map from the hallway

Chapter 1
Introduction

Mobile robots have until recently primarily been used in industrial production and for military operations. Nowadays, however, the technology is progressing towards so-called service robots, intended to assist humans in everyday life. These types of robots have been in development for years and will soon influence many aspects of our lives.

Autonomous navigation in obstacle-filled environments has always presented a considerable challenge. Solving this problem would allow mobile robots to move about and complete tasks such as cleaning rooms, assisting disabled people and many other missions, without the risk of damaging themselves or their environment.

To navigate effectively, the robot must be able to perceive and map the surrounding world accurately. It is also absolutely critical for mission success that the robot identifies all obstacles around it. In this project the sensor used for these tasks is a 240° laser scanner mounted on a small differential-drive mobile robot.

1.1 Thesis Statement

The aim of the project is for an autonomous mobile robot to successfully navigate in a completely unknown indoor environment. Success criteria include, but are not limited to:

- A forward velocity of at least 0.2 m/s.
- No practical restrictions on how far away the robot's goal can be located in relation to the starting point.
- Real-time stability.
- Plug-in functionality.
- No collisions with obstacles.

Using the data collected from the laser scanner and the robot's odometry, the project seeks to implement an internal grid-based map and use a wavefront path planning algorithm to successfully avoid all obstacles.

The system in its entirety is to be written in C/C++ and used as a plug-in by the robot's internal laser scanner server, thereby eliminating most of the delay associated with wireless communication.

Also, all fundamental parts of the plug-in are to be designed as modules, so that the plug-in can be used as an open library for future use.

1.2 Work by Others

Several of the points outlined in section 1.1 have been addressed by others, notably in (Galve, 2008). The work in this project is based on (Galve, 2008), in particular the theory behind coordinate transformations and map representation methods. However, Galve's project is implemented in MatLab, and as such is not particularly optimized for processing speed. This is evident from the slow movement of the robot during execution.

The actual control of the robot is interfaced by the MRC program, developed by Associate Professor Nils A. Andersen. The communication with the robot is handled by the ulmsserver application, developed by Assistant Professor Jens Christian Andersen.

Chapter 2
The SMR Platform

2.1 The Small Mobile Robot

For the practical implementation of the obstacle avoidance system in this project, the Small Mobile Robot (SMR) platform is used. These robots were developed by DTU, using mostly off-the-shelf components, and are highly customizable and versatile. For real-time stability each robot runs a version of Linux installed with RTAI (RealTime Application Interface, http://www.rtai.org/).

Interfacing the SMR's hardware is done through the MRC and ulmsserver applications. MRC provides a way to control the movement of the wheels and measure the robot's pose in (x, y, θ)-format, along with measurements from the various sensors, which are of less importance to this project.

To control the SMR's movement, a language called SMR-CL, developed by (Andersen and Ravn, 2004), is used. SMR-CL is an intuitive script language intended for real-time control of the SMR without sacrificing capability.

A 240° laser scanner is mounted at the front of the robot. A server used to access the laser scanner, the ulmsserver application mentioned before, is run locally on the SMR. It provides measurements from the laser scanner in an XML-based format; each measurement also includes the odometry of the robot at the time the measurement was made.

2.2 Laser Scanner

The scanner is a Hokuyo URG-04LX and has a scanning angle of up to 240°. In short, the scanner is both a light source, emitting laser light onto a rotating mirror, and a sensor. After leaving the mirror, the laser light is reflected off objects around the scanner and returned.

The scanner has a maximum measuring range of up to 4 meters. It has an angular resolution of θ = 0.352°, and a range accuracy of ±10 mm within 1 m. The accuracy degrades with measurement distance, however, so that from 1 m to 4 m the accuracy is ±1% of the measured range (Hokuyo, July 2005). When a scan is made, it becomes available through the ulmsserver application. An example of a scan is seen in figure 2.1. Note that when a measurement returns no reflection, because no obstacle is within the maximum measuring range, that data point is set to 4 meters.

Although the scanner has a field of view of 240°, only the middle 180° are considered. Using the extra 30° on both sides would in some cases give a slightly better view, but they are often blocked by the front wheels if the robot has been turning. Therefore the extra data is discarded, in order to avoid misleading measurements.

Figure 2.1: Laser scanner example. Left: Photograph of setup. Right: Plot of laser scan data in MatLab.
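As a concrete illustration of this preprocessing, the minimal sketch below clips a raw 240° scan to the middle 180° and substitutes the 4 m maximum range for readings with no reflection. The names (ScanPoint, preprocessScan) and parameters are illustrative assumptions, not the actual ulmsserver plug-in interface.

```cpp
// Sketch only: clip a 240-degree Hokuyo scan to the middle 180 degrees and
// clamp "no reflection" readings to the 4 m maximum range. Names and the
// angle convention are assumptions for illustration.
#include <vector>
#include <cstddef>

struct ScanPoint {
    double angle;  // [rad], 0 along the robot's forward axis
    double range;  // [m]
};

std::vector<ScanPoint> preprocessScan(const std::vector<double>& ranges,
                                      double angleMin, double angleStep)
{
    const double kPi      = 3.14159265358979323846;
    const double maxRange = 4.0;        // sensor maximum [m]
    const double halfFov  = kPi / 2.0;  // keep only +/- 90 degrees
    std::vector<ScanPoint> out;
    out.reserve(ranges.size());

    for (std::size_t i = 0; i < ranges.size(); ++i) {
        double angle = angleMin + i * angleStep;
        if (angle < -halfFov || angle > halfFov)
            continue;                   // discard the outer 2 x 30 degrees
        double r = ranges[i];
        if (r <= 0.0 || r > maxRange)
            r = maxRange;               // "no reflection" -> treat as 4 m
        out.push_back({angle, r});
    }
    return out;
}
```

The clipped, clamped scan is then ready to be transformed into map coordinates as described in chapter 3.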

Chapter 3
Map Building

In order to adequately process the environment around the robot, an effective internal representation of that environment is needed. The success criteria for this representation are both high accuracy and low execution time. Since the robot at hand only moves in two dimensions, the entire process can be simplified to a representation of the two-dimensional planar world around the robot. Because robot technology aims to minimize workload and hardware requirements, numerous solutions to this problem have been developed over the years (Siegwart and Nourbakhsh, 2004), two of which are discussed in this chapter.

3.1 Map Representations

By approximating objects in the real world with square cells in a large map, it is possible to find a balance between workload and accuracy, so that execution time is low and the planned path is still safe. A first intuitive idea is to represent a large obstacle with large squares within its boundaries and then use smaller squares to approximate the rest, as seen in figure 3.1.

Figure 3.1: Left: Simulated obstacle as would be seen by the laser scanner. Right: Variable cell size grid map representation of the obstacle.

In programming languages, however, it is quite difficult to represent this type of variable cell size map. A fixed cell size map, on the other hand, can be implemented using a simple 2D matrix of numbers, where each number in the matrix corresponds to a cell of fixed resolution in the real world. With a low computational time, resulting in a high update frequency, and a laser scanner, this method of map representation can even correct errors made from previous scans. The downside is not being able to map the environment with more precision than given by the map resolution. It also increases memory usage, as the variable map in figure 3.1 has about 120 cells, while the fixed map in figure 3.2 has 1024 cells.

In this project a map resolution of 5 cm is chosen, since anything smaller has been seen to result in large odometry errors (Galve, 2008). This means that the largest possible error in the internal map is 2.5 cm.

3.2 Fixed Cell Size Grid Map

Figure 3.2 shows a fixed grid map, where black cells are objects that the laser scanner has detected, grey cells indicate a clear area and white cells indicate an unknown area. In order to keep the internal map static with regard to orientation, a transformation is needed from laser scanner coordinates to map coordinates (Galve, 2008):

\[
\begin{bmatrix} X_m \\ Y_m \\ 1 \end{bmatrix} =
\begin{bmatrix} \cos\theta & -\sin\theta & O_x \\ \sin\theta & \cos\theta & O_y \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X_s \\ Y_s \\ 1 \end{bmatrix}
\qquad (3.1)
\]

In equation 3.1, θ is the orientation of the scanner in the global coordinate system and (O_x, O_y) is the position of the scanner's origin expressed in the map coordinate system. By using this transformation it is easy to convert between coordinates expressed in laser coordinates (X_s, Y_s) and map coordinates (X_m, Y_m).
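The sketch below illustrates how equation 3.1 and a fixed cell size grid map could be combined in code: a laser-frame point is rotated and translated into map coordinates and then quantized into a 5 cm cell index. GridMap, laserToMap and the field names are assumptions for illustration, not the plug-in's actual data structures.

```cpp
// Sketch of a fixed cell size grid map (5 cm resolution) plus the
// laser-to-map transform of equation (3.1). Illustrative names only.
#include <vector>
#include <cmath>

enum CellState { UNKNOWN, CLEAR, OBSTACLE };

struct GridMap {
    int width, height;          // size in cells
    double resolution;          // m per cell, 0.05 in this project
    double originX, originY;    // map coordinates of cell (0,0) [m]
    std::vector<CellState> cells;

    GridMap(int w, int h, double res, double ox, double oy)
        : width(w), height(h), resolution(res),
          originX(ox), originY(oy), cells(w * h, UNKNOWN) {}

    // Convert map coordinates to a cell index; returns false if outside.
    bool worldToCell(double x, double y, int& cx, int& cy) const {
        cx = static_cast<int>(std::floor((x - originX) / resolution));
        cy = static_cast<int>(std::floor((y - originY) / resolution));
        return cx >= 0 && cx < width && cy >= 0 && cy < height;
    }

    void set(int cx, int cy, CellState s) { cells[cy * width + cx] = s; }
};

// Equation (3.1): rotate a laser-frame point (xs, ys) by the scanner's global
// orientation theta and translate by the scanner origin (ox, oy) in map frame.
void laserToMap(double xs, double ys, double theta, double ox, double oy,
                double& xm, double& ym)
{
    xm = std::cos(theta) * xs - std::sin(theta) * ys + ox;
    ym = std::sin(theta) * xs + std::cos(theta) * ys + oy;
}
```

Marking an obstacle then amounts to calling laserToMap on the measurement, converting with worldToCell, and setting that cell to OBSTACLE.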

Figure 3.2: Fixed cell size grid map representation of the obstacle in figure 3.1.

3.3 Bresenham Line Algorithm

When a laser scan consisting of angle/distance pairs has been transformed into the correct (x, y) coordinates, the corresponding cells in the internal map are marked to represent obstacles. It is also necessary to mark as clear all the cells between the obstacles and the robot along the measurement path, so that the robot knows these cells are traversable. This is done with the Bresenham line algorithm (Bresenham, 1965). This algorithm determines which points need to be painted in order to achieve a good approximation of a line between two given points. However, given that the laser scans have a finite angular resolution, there tend to be unpainted areas between measurements. To avoid this, each point from the Bresenham algorithm is dilated into a 5-cell cross, as seen in figure 3.3.

Figure 3.3: Example of two points and the Bresenham line that connects them. The lighter grey cells are part of the line due to 5-cell dilation. All grey cells will be painted.
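A sketch of the idea follows: the standard integer Bresenham line walk between the robot cell and the obstacle cell, with every visited cell painted as a 5-cell cross. The function names and the paintCell callback are illustrative assumptions, not the thesis implementation.

```cpp
// Sketch: Bresenham line between two grid cells, painting each visited cell
// and its four 4-connected neighbours (the 5-cell cross) as clear. The
// paintCell callback is assumed to mark a map cell CLEAR and to ignore
// out-of-bounds coordinates.
#include <cstdlib>

void paintCross(int x, int y, void (*paintCell)(int, int))
{
    paintCell(x, y);
    paintCell(x + 1, y);
    paintCell(x - 1, y);
    paintCell(x, y + 1);
    paintCell(x, y - 1);
}

// Classic all-octant integer Bresenham line from (x0, y0) to (x1, y1).
void bresenhamClear(int x0, int y0, int x1, int y1,
                    void (*paintCell)(int, int))
{
    int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;

    while (true) {
        paintCross(x0, y0, paintCell);
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}
```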

3.4 Map Dilation

When planning a path for a robot to follow, one would intuitively consider every single point occupied by the robot and assert that none of those points collide with the environment. However, keeping track of all the points at once and planning a route based on them is neither effective nor easy to do. An easy solution to this problem is representing the robot as a single point in the grid map. Safe route planning is then achieved using map dilation (Jazayeri et al., 2006).

Map dilation is the process in which every cell classified as part of an obstacle is dilated by a certain amount, so that the robot's size is effectively represented as a part of the obstacles. This way the path planner need not consider the size of the robot in its calculations.

There are two methods of performing the dilation: adding either a square or a circle around each obstacle point. The first method seems intuitively correct since the robot itself is square, but this type of dilation is inaccurate since the robot's orientation near a given point cannot be foreseen. Performing the dilation with a circle, on the other hand, eliminates all orientation considerations.

There are other advantages to this general approach as well. The most noticeable is the averaging of measurement points. Since every single point is dilated with a relatively large area, the accuracy of the laser scanner becomes less of an issue, because most of the points' dilations overlap. This is seen in figure 3.4, where the distant measurements to the right still produce an even dilated area.

Figure 3.4: Scattered measurement points from a laser scan (black) and the surrounding dilation cells (grey).
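The circular dilation can be sketched as below, where every obstacle cell is expanded into a disc whose radius corresponds to the robot's radius measured in cells. This is a simple brute-force version with assumed names, not the optimized plug-in code.

```cpp
// Sketch of circular map dilation: expand every occupied cell by the robot
// radius so the planner can treat the robot as a single point.
// 'radiusCells' would be the robot radius divided by the 5 cm cell size,
// rounded up. Names and layout are illustrative assumptions.
#include <vector>

// 'src' is the raw obstacle map, 'dst' the dilated copy; both are
// width*height arrays where a non-zero value means "obstacle".
void dilateCircular(const std::vector<int>& src, std::vector<int>& dst,
                    int width, int height, int radiusCells)
{
    dst = src;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (!src[y * width + x])
                continue;                         // only dilate obstacle cells
            for (int dy = -radiusCells; dy <= radiusCells; ++dy) {
                for (int dx = -radiusCells; dx <= radiusCells; ++dx) {
                    if (dx * dx + dy * dy > radiusCells * radiusCells)
                        continue;                 // keep the disc shape
                    int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < width && ny >= 0 && ny < height)
                        dst[ny * width + nx] = 1;
                }
            }
        }
    }
}
```

Because neighbouring dilation discs overlap, scattered or slightly noisy measurements merge into one smooth obstacle region, which is exactly the averaging effect described above.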

3.5 Map Reset

Because the internal map has a finite size and the universe does not, a problem arises whenever the position of the goal lies outside the boundaries of the internal map. There are a few solutions to this problem: create multiple connected maps, dynamically expand the internal map, or reset the internal map during the mission. With the first two solutions the robot would need to allocate relatively large amounts of memory during execution, which would be against real-time guidelines. Instead, the last option is chosen. Every time the robot is within 0.5 m of any of the edges of the internal map, the map is reset and the robot is placed in the middle of the new map. This ensures that the robot is never out of bounds and that the current provisional goal can be placed on the current internal map. In this manner, the robot's travelling distance is not limited by the implementation.
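A sketch of this check, with assumed names and a simple flat cell array, is shown below: when the robot is within 0.5 m of an edge, the map is cleared and its origin is recomputed so that the robot sits at the centre of the fresh map.

```cpp
// Sketch of the map-reset rule described above (illustrative names). The map
// memory is reused, so no new allocation happens at run time, in keeping
// with the real-time guidelines.
#include <vector>
#include <algorithm>

struct ResetResult { bool wasReset; double newOriginX, newOriginY; };

ResetResult maybeResetMap(std::vector<int>& cells, int width, int height,
                          double resolution,          // 0.05 m per cell
                          double originX, double originY,
                          double robotX, double robotY)
{
    const double margin = 0.5;                        // [m] from any edge
    double mapW = width * resolution, mapH = height * resolution;
    bool nearEdge =
        robotX - originX < margin || originX + mapW - robotX < margin ||
        robotY - originY < margin || originY + mapH - robotY < margin;
    if (!nearEdge)
        return {false, originX, originY};

    std::fill(cells.begin(), cells.end(), 0);         // forget the old map
    // Re-centre: the robot's current pose becomes the middle of the new map.
    return {true, robotX - mapW / 2.0, robotY - mapH / 2.0};
}
```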


Chapter 4
Path Planning

4.1 Types of Path Planning

There are several ways to plan a movement path from a fixed cell size grid map, but among the simplest algorithms are the wavefront planner and the A* search algorithm (Galve, 2008). A* is computationally faster, but has a more complex implementation. For the sake of this project, the simpler wavefront planner is used. This can be justified by the fact that the searches can be executed extremely fast in C. While execution time was a major factor in Galve's selection of A*, this project is more focused on the implementation as a plug-in, and can later be expanded with a more thorough or faster path planning algorithm.

The path planning is implemented as two separate functions, one that maps the distance associated with each cell, and one that calculates the actual path from this map. This modular structure allows for
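For reference, the first of these two steps, the wavefront expansion over a fixed cell size grid map, can be sketched as a breadth-first search from the goal cell, as below. The names and the 4-connected neighbourhood are illustrative assumptions, not necessarily the thesis's exact implementation.

```cpp
// Sketch of a wavefront (grid BFS) expansion. Starting from the goal cell,
// every free cell is labelled with its step distance to the goal; a path can
// then be extracted by repeatedly stepping, from the robot's cell, to the
// neighbour with the lowest label.
#include <vector>
#include <queue>
#include <utility>

// 'occupied' is the dilated map (non-zero = obstacle). Returns a distance map
// where the goal has value 0 and unreachable or obstacle cells keep -1.
std::vector<int> wavefront(const std::vector<int>& occupied,
                           int width, int height, int goalX, int goalY)
{
    std::vector<int> dist(width * height, -1);
    std::queue<std::pair<int, int>> frontier;
    dist[goalY * width + goalX] = 0;
    frontier.push({goalX, goalY});

    const int dx[4] = {1, -1, 0, 0};
    const int dy[4] = {0, 0, 1, -1};

    while (!frontier.empty()) {
        auto [x, y] = frontier.front();
        frontier.pop();
        for (int i = 0; i < 4; ++i) {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx < 0 || nx >= width || ny < 0 || ny >= height)
                continue;
            int idx = ny * width + nx;
            if (occupied[idx] || dist[idx] != -1)
                continue;               // skip obstacles and visited cells
            dist[idx] = dist[y * width + x] + 1;
            frontier.push({nx, ny});
        }
    }
    return dist;
}
```

Keeping the expansion and the path extraction as separate functions, as the text describes, means either half can be replaced later, for example by an A* search, without touching the other.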
