
OBSTACLE DETECTION AND AVOIDANCE

FOR AN AUTONOMOUS FARM TRACTOR

by

Keith W. Gray

A thesis submitted in partial fulfillment of the requirements for the degree

of

MASTER OF SCIENCE

in

Electrical Engineering

Approved:

_____________________                  _____________________
Kevin Moore                            Nick Flann
Major Professor                        Committee Member

_____________________                  _____________________
Kay Baker                              Noelle E. Cockett
Committee Member                       Interim Dean of Graduate Studies

UTAH STATE UNIVERSITY
Logan, Utah

2000


Copyright Keith W. Gray 2000

All Rights Reserved


Abstract

Obstacle Detection and Avoidance

for an Autonomous Farm Tractor

by

Keith W. Gray, Master of Science

Utah State University, 2000

Major Professor: Dr. Kevin L. Moore
Department: Electrical and Computer Engineering

The Center for Self-Organizing and Intelligent Systems (CSOIS) is developing

autonomous ground vehicles for use in farming applications. CSOIS has targeted

obstacle detection and avoidance as key challenges in real-world implementation of

autonomous farming vehicles. A range sensor, giving real-time updates of the

surrounding environment, performs obstacle detection. Obstacle avoidance is

accomplished through a combination of global and local avoidance subsystems that deal

with both known and unknown obstacles in the operating area of the vehicle. The global

avoidance subsystem is a mission-level path planner that preplans paths around all known

obstacles while the local avoidance subsystem maneuvers the tractor around unknown

obstacles. The local avoidance subsystem consists of an obstacle filter and an obstacle

avoidance algorithm. The obstacle filter reports unknown obstacles to the path planner

and enables the avoidance algorithm if the preplanned path is blocked. The avoidance


algorithm learns of all the known and unknown obstacles from the obstacle filter so that

all obstacles can be safely avoided. The task of the avoidance algorithm is to maneuver

the vehicle around the obstacle with the goal of returning to the preplanned path as

quickly as possible. This thesis describes the unique challenges to autonomous farm

vehicle navigation and the solutions reached that enabled success.

(98 Pages)


Acknowledgments

I would like to thank the many people that helped me on this project. First and

foremost, I would like to thank the John Deere project team: Mel, Mitch, Sarah, and Don.

You four are the real reason the obstacle detection and avoidance worked on the tractor.

Your input, suggestions, and patience helped me see past the obstacles in my thinking

and really shaped the final product. I would also like to thank my professors at Utah

State University, Dr. Moore and Dr. Flann, who gave valuable input and suggestions as

well as the opportunity to be part of this great team of engineers. I would like to thank

John Deere Inc. for their belief in our team and for the growth that I received because of

the project. I want to thank my wife, Malia, who supported me through this whole ordeal

and always encouraged me when I was burned out. Even though I rarely saw her that

summer, I love her deeply. Finally, I want to thank my God, who has blessed me with the

desire to learn and grow and the ability to understand that I am nothing without him.

Keith W. Gray


Contents

Page

Abstract ............................................................................................................................. iii

Acknowledgments.............................................................................................................. v

List of Tables...................................................................................................................viii

List of Figures ................................................................................................................... ix

1 Introduction ................................................................................................................... 1

1.1 Motivation ............................................................................................................... 1
1.2 Summary of Results ................................................................................................ 2
1.3 Thesis Outline .......................................................................................................... 4

2 Overall System............................................................................................................... 6

2.1 Base Station.............................................................................................................. 8
2.2 Master Node ............................................................................................................. 8
2.3 Path Planner Node.................................................................................................. 11
2.4 Sensor Node ........................................................................................................... 12

3 Obstacle Detection Technology................................................................................... 16

3.1 Obstacle Detection Sensors.................................................................................... 17
      3.1.1 CCD Camera ............................................................................................. 17
      3.1.2 Ultrasonic Sensors..................................................................................... 20
      3.1.3 Scanning Laser .......................................................................................... 21
      3.1.4 3D Scanning Lasers................................................................................... 23
      3.1.5 Millimeter Wave Radar............................................................................. 23

3.2 Comparison of Sensors........................................................................................... 24
      3.2.1 Weather ..................................................................................................... 24
      3.2.2 Light .......................................................................................................... 24
      3.2.3 Detection Distance .................................................................................... 25
      3.2.4 Response Time .......................................................................................... 25
      3.2.5 Cost............................................................................................................ 26
      3.2.6 Summary ................................................................................................... 27

3.3 Detection Sensor Used on Tractor ........................................................................ 28


4 Obstacle Avoidance..................................................................................................... 31

4.1 Global and Local Obstacle Avoidance................................................................... 31
4.2 Vision of Obstacle Avoidance................................................................................ 32
4.3 Obstacle Avoidance Used in Research................................................................... 33

      4.3.1 Wall-Following ......................................................................................... 34
      4.3.2 Black Hole................................................................................................. 36
      4.3.3 Path Computing......................................................................................... 38
      4.3.4 Potential Fields.......................................................................................... 40
      4.3.5 Histogram.................................................................................................. 41

5 Obstacle Detection and Avoidance ............................................................................ 52

5.1 Obstacle Detection and Avoidance Data Flow ...................................................... 52
5.2 Obstacle Detection ................................................................................................. 53
5.3 Obstacle Filter ........................................................................................................ 58

      5.3.1 Obstacle Filter Flow of Control ................................................................ 60
      5.3.2 Dynamic Obstacles Overview................................................................... 62

5.4 Avoidance.............................................................................................................. 64

6 Simulations and Vehicle Tests.................................................................................... 77

7 Conclusions and Recommendations .......................................................................... 81

References ........................................................................................................................ 84

Appendix A ...................................................................................................................... 87


List of Tables

Table Page

5.1 LMS Continuous Data Report Packet. ................................................................. 54

5.2 PC to LMS Reset Packet. ..................................................................................... 56

5.3 PC to LMS Request for Continuous Data. ........................................................... 56


List of Figures

Figure Page

2.1 Overall system architecture operation.................................................................... 6

2.2 Master node architecture. ....................................................................................... 9

2.3 Sensor node architecture. ..................................................................................... 13

3.1 Comparison of obstacle detection sensors. ......................................................... 27

3.2 Laser range finder mounted on tractor. ................................................................ 29

4.1 Example of wall following technique as presented in [30].................................. 35

4.2 Example of certainty grid. .................................................................................... 43

4.3 Vehicle tracking a point on desired path. ............................................................. 44

4.4 Example of blocked directions. ............................................................................ 48

5.1 Obstacle detection and avoidance flow diagram.................................................. 53

5.2 LMS output with 0.5° resolution.......................................................................... 55

5.3 LMS obstacle determination. ............................................................................... 58

5.4 Obstacle filter flow diagram................................................................................. 61

5.5 Certainty grid with active window overlay. ......................................................... 67

5.6 Precompiled vector direction matrix. ................................................................... 68

5.7 Precompiled vector magnitude matrix. ................................................................ 69

5.8 Classic boy scout orienteering problem. .............................................................. 72

5.9 Exit state............................................................................................................... 74

5.10 Straight state. ........................................................................................................ 74

5.11 Return state........................................................................................................... 75


5.12 Stop state. ............................................................................................................. 75

5.13 Block diagram of steering control during avoidance. .......................................... 76

6.1 Map of VFH+ simulation. .................................................................................... 78

6.1 Map of VFH+ simulation (cont.). ........................................................................ 79


Chapter 1

Introduction

This thesis presents the design and implementation of an obstacle detection and

avoidance system for use on an automated tractor. The research described in this thesis

was carried out as part of a larger project aimed at demonstrating obstacle detection and

avoidance for an autonomous tractor operating in a typical farming environment.

Specific contributions of the thesis include an assessment of the different obstacle

detection sensors and obstacle avoidance algorithms used in autonomous vehicle

research, and development of a new obstacle avoidance system for the tractor, including a

unique obstacle avoidance algorithm. Simulation and testing results show the

effectiveness of the approach.

1.1 Motivation

The idea of an autonomous tractor is not a new one. Every child who lived on a

farm had dreams of a farm vehicle that could take over the plowing and harvesting

chores, saving them from the boredom of driving long monotonous hours. With

advances in GPS technology and computerization of farm equipment this dream is now

closer to reality. Now the simple, yet boring, task of plowing and harvesting a field can

be turned over to an autonomous tractor that will never get tired and that will do the job it

is tasked to do. While an autonomous farm tractor is a realizable dream, there are still

several concerns that need to be addressed before such tractors become common on the family

farm. Two of the biggest concerns are first, how the tractor will see its environment and,

second, how the tractor will react to its environment.


The ability to sense the surrounding environment is an important issue for any

autonomous vehicle. As human beings we have the most powerful set of sensors and the

best possible computer available for interpreting our environment. For an autonomous

tractor, the question becomes: is it possible to find sensors that can gather sufficient

environmental data for safe vehicle navigation? Then, once the environmental data has

been captured, is there a fast and effective way to interpret the data and determine what it

means to the tractor? In order to take the farmer out of the tractor these two questions

must be answered in the affirmative.

Once the environment around the tractor has been determined then the vehicle

must react accordingly. A farmer’s reaction to an unexpected object in his way is nearly

instant and yet thought out. He knows that if he needs to swerve around a big rock that

has become unearthed, he cannot cross over any hay that has been cut, or travel over

a fence line or ditch. As with detecting the environment, if the farmer is to be

taken out of the tractor, the tractor's reaction must be fast and precise. It is

necessary to be able to make correct decisions about the appropriate action needed in

response to the data about the environment.

1.2 Summary of Results

The unique obstacle avoidance algorithm as developed by the author and presented

in this thesis was demonstrated before the project sponsors in a farm field west of Logan

Utah, in August of 1999. Different scenarios were used to illustrate the need for obstacle

detection and avoidance as well as demonstrate the effectiveness of the avoidance

algorithm presented in this thesis. The different scenarios included an obstacle on the


path of travel, obstacles to the right or left of the path of travel, known obstacles close to

the path of travel, and dynamic, or moving, obstacles.

In the first scenario, an obstacle on the path of travel, the tractor was effective at

determining the environment around the obstacle and deciding whether to

avoid to the right or left of the obstacle. If the path of travel was close to the border of

the field, then the avoidance algorithm would avoid towards the center of the field. In the

next scenario, obstacles to the right or left of the path, the avoidance algorithm correctly

drove the tractor the shortest distance around the obstacle and back toward the path.

The next scenario, known obstacle close to the path of travel, demonstrated the

obstacle filter portion of the obstacle avoidance algorithm. When the obstacle detection

sensors detected a known obstacle, such as the fence line surrounding the field, the

obstacle filter did not inform the avoidance algorithm because the obstacle did not block

the path of travel. If, however, the tractor was avoiding an unknown obstacle and a

known obstacle was encountered in the process, that obstacle was also effectively

avoided.

The final scenario, dynamic obstacles, was demonstrated using a remote controlled

truck. The truck was fitted to be tall enough and wide enough to be seen by the detection

sensor. Three different scenarios with the dynamic obstacle showed the robustness of the

avoidance algorithm. The first scenario had the dynamic obstacle approach the tractor

from the right or left. As the tractor avoided the dynamic obstacle the obstacle continued

to move towards the tractor. When the obstacle was too close to the tractor, the avoidance

algorithm stopped the tractor. The next scenario involving the dynamic obstacle had the


obstacle move in front of the tractor; as the tractor started avoiding the obstacle, the

obstacle quickly moved out of sight of the tractor and normal tractor driving resumed.

The final scenario with the dynamic obstacle had the obstacle approach the vehicle

causing the tractor to avoid. As the tractor attempted to maneuver around the obstacle the

obstacle moved with the tractor, pushing the tractor further from the desired path of

travel. When the tractor was more than one swath width away from the path of travel the

avoidance algorithm stopped the tractor.

1.3 Thesis Outline

The thesis is organized as follows. Chapter 2 discusses the tractor system,

illustrating how this work fits into the overall project. In Chapter 3 we discuss various

obstacle detection technologies used for autonomous vehicles. We also describe the

obstacle detection sensor used in this project. In Chapter 4 we examine obstacle

avoidance algorithms other researchers have used for autonomous vehicles. Section 4.1

discusses the difference between global and local or reactive avoidance. Section 4.2 lists

the obstacle avoidance requirements that were specified for the overall system developed

in the larger project. Section 4.3 then considers different obstacle avoidance algorithms

and compares them against the requirements in Section 4.2. In particular, subsection

4.3.5 is a detailed description of one obstacle avoidance algorithm that served as the

model for the avoidance algorithm developed by the author. Chapter 5 gives a detailed

explanation of the obstacle detection and avoidance system that was developed for the

autonomous tractor system. This chapter also describes how the system works together

to successfully detect and interpret the environment surrounding the vehicle and then


react according to this environment. Section 5.3 and Section 5.4 detail the unique

approach to the obstacle avoidance algorithm. Chapter 6 presents the simulation and

actual testing of the detection and avoidance system. Chapter 7 concludes the thesis with

suggestions for future work.


Chapter 2

Overall System

In this chapter we describe the overall autonomous tractor system for which the

obstacle detection and avoidance algorithm was developed. Each subsystem in the

system is discussed and we explain how the various subsystems interact with each other

to create a fully autonomous farm tractor. This architecture was developed by the

engineering team as part of the larger project [1].

As seen in Fig. 2.1 the autonomous tractor system is made up of four main

subsystems: the base station, the master node, the path planner node, and the sensor node.

The interaction of each of the subsystems will be illustrated by describing a typical task

[Fig. 2.1: Overall system architecture operation. Block diagram: the base station (user GUI, emergency stop button, wireless LAN, modem) communicates over TCP/IP and RS-232 links with the tractor's master node, path planner node, sensor node, sensors, and tractor control.]


for the autonomous tractor. The goal of the mission is to perform a simple sweep

pattern in a field. From the base station we use the User GUI to select the field of

operation. The sensor node conveys to the path planner the exact location of the tractor

and the current heading of the tractor. The path planner uses the position and attitude

data and the desired field of operation to plan a sweep path. The path planner takes into

account the turning radius of the vehicle as well as the size of the farming implement the

tractor may be using while planning the sweep pattern. Once the path planner has

planned a mission the mission is visible on the User GUI at the base station and we can

begin the mission. We then select on the GUI the start mission command. At this point

the path information generated by the path planner is forwarded to the master node and

the sensor node. The master node converts the path information into driving and steering

commands. These commands control the tractor along the path, thus performing the

desired task. The sensor node provides feedback to the master node controller on the

tractor’s position and attitude along the path. If the obstacle detection sensor senses an

obstacle that will impede the tractor's travel, then the obstacle avoidance algorithm

calculates driving and steering commands that will safely avoid the obstacle. These

commands are relayed to the master node, which carries out the commands. If for some

reason we feel that the tractor is behaving in an unsafe manner we have an emergency

stop button at the base station. This button allows us to shut off the tractor engine and

engage the emergency brake.


The remainder of this chapter will discuss each of these different subsystems and

how they contribute to the vehicle operation as well as to obstacle detection and

avoidance.

2.1 Base Station

The base station’s contribution to vehicle operation and obstacle detection and

avoidance is perhaps the most important of all the subsystems. The reason for this

is a simple old cliché that states, “seeing is believing.” If a farmer is to turn an

autonomous tractor loose in his field where the liabilities are high if something goes

wrong, then he will want accurate information about the tractor and he will want to stop it

immediately if something goes wrong. This is what the base station provides. The base

station is the communication link between the farmer and the tractor. The base station

consists of a graphical user interface (GUI) and a tractor kill button. The GUI allows the

farmer to task the tractor and then displays in real-time where the tractor is and what it is

doing. The base station also provides a graphical display of the instrument panel from

the tractor cockpit. The GUI also displays in real-time the environment around the

tractor by displaying the obstacle detection sensor information. If the farmer feels the

need to stop the tractor at any time during the course of the mission he can flip a switch

and instantly stop the tractor’s engine and lock the brakes.

2.2 Master Node

The master node performs all the functions needed to drive and steer the tractor.

During obstacle avoidance the avoidance algorithm calculates the proper heading for


[Fig. 2.2: Master node architecture. Block diagram: the master controller contains a path tracker, drive controller, status broadcaster, listeners, script parser and trajectory generator, watchdogs, motion exception handler, path store, and position and attitude processor; it takes inputs from the tractor, the nav/sensor node, the mission planner, and the joystick, and sends drive vectors to the wheel steering and vehicle state to the other nodes.]

collision-free travel, but it must communicate this heading to the master node whose task

is to steer the tractor to that heading. In the remainder of this section we describe how

the master node controls the tractor by discussing the architecture shown in Fig. 2.2 [1].

In order to operate the tractor the master node receives data from four different

sources. The tractor itself has an on-board computer that interfaces with the master node.

Through this link the master node can monitor engine RPM, throttle, and other

parameters necessary for driving. This same link also allows the master node to make

any changes to the tractor’s driving, such as increasing or decreasing the throttle and

steering angle. The joystick is the second source of input data into the master node. The

input from the joystick is interpreted as steering angle and throttle speed. All of the path

information computed by the path planner is the third data input into the master node.


The path information is sent as start points and end points of the desired path along

with the radius of the path. If the radius is zero then the path is a straight line, if the

radius is greater than zero then the path is a circle. The final input to the master node is

from the sensor node. The sensor node communicates the vehicle's position and

heading. If the sensor node is avoiding obstacles then a motion exception is also

communicated to the master, indicating the speed and steering angle needed to avoid the

obstacle.

In addition to receiving data from the other nodes, the master node transmits data

to the planner and sensor nodes. In order to keep the other nodes up to date with the

current vehicle position and heading (heading and position from sensor are augmented in

the master node with vehicle odometry for more accurate readings) and other status

information, the master node has a status broadcaster. The status broadcaster updates

tractor parameters every 250 ms and can be accessed by the sensor and planner node

through a master listener function.
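
To make the broadcaster/listener arrangement concrete, the sketch below shows one way a 250 ms status loop might be structured. It is only an illustration: the status fields, class name, and callback interface are hypothetical and are not taken from the actual master node software.

    import threading
    import time

    class StatusBroadcaster:
        """Publishes the latest vehicle status to registered listeners every 250 ms.
        Illustrative sketch only; field names and interfaces are hypothetical."""

        def __init__(self, period_s=0.25):
            self.period_s = period_s      # 250 ms update rate described above
            self.listeners = []           # callbacks registered by the sensor and planner nodes
            self.status = {"x": 0.0, "y": 0.0, "heading_deg": 0.0, "speed": 0.0}
            self._lock = threading.Lock()

        def update(self, **fields):
            # Called as new position, attitude, and odometry data arrive.
            with self._lock:
                self.status.update(fields)

        def add_listener(self, callback):
            self.listeners.append(callback)

        def run(self):
            while True:
                with self._lock:
                    snapshot = dict(self.status)
                for listener in self.listeners:
                    listener(snapshot)    # sent over TCP/IP in the real system
                time.sleep(self.period_s)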

The reason for the inter-node communication is so the master node knows exactly

where the tractor needs to be and which direction it needs to be heading. It is then the

master node’s responsibility to drive the tractor to where it needs to be and steer it in the

direction it needs to be headed. This is accomplished through the drive controller,

which resolves path commands from the path planner node and position and attitude

information from the sensor node. In the case of obstacle avoidance the path commands

from the planner node are disregarded as the sensor node provides the needed driving

information to the drive controller.
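
A minimal sketch of this arbitration, assuming a simple command-tuple interface (the names are hypothetical and the real drive controller is considerably more involved):

    def select_drive_command(planner_cmd, motion_exception):
        """Resolve the two command sources described above.

        planner_cmd      -- (speed, steer_angle_deg) derived from the preplanned path
        motion_exception -- (speed, steer_angle_deg) from the sensor node, or None
        """
        if motion_exception is not None:
            # Obstacle avoidance is active: the sensor node's command is used
            # and the path command from the planner node is disregarded.
            return motion_exception
        return planner_cmd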


2.3 Path Planner Node

The path planner’s contribution to obstacle detection and avoidance is to provide

mission information to the master and sensor node and to give a global understanding of

the tractor’s environment. In the remainder of this section we discuss how the path

planner creates missions and informs the other nodes of this mission. We also discuss

how the planner gives a global understanding of the tractor’s environment.

Path planning is done using a terrain map, a possibility graph, and a search. The

terrain map stores terrain information of the area where the vehicle will be driven,

including locations of roads, fields, obstacles, etc. The possibility graph, which

represents possible paths through the map, is made up of nodes and edges. Nodes can be

compared to street intersections and edges to streets. Traveling from node to node is

done by following an edge or a combination of edges that connects the two nodes.

Adding nodes and edges to the graph extends the graph and allows travel to previously

unattainable areas. Each edge is assigned a cost based on terrain information stored in

the map. An “A*” search algorithm, as described in [2], uses edge costs to find optimal

paths from one point to another in the graph.
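
As a point of reference, a generic A* search over such a node-and-edge graph can be sketched as follows; this is not the project's planner code, and the straight-line heuristic and data layout are assumptions made for illustration.

    import heapq
    import math

    def a_star(nodes, edges, start, goal):
        """Find a cheapest path through a possibility graph (illustrative sketch).

        nodes -- {name: (x, y)} graph nodes, analogous to street intersections
        edges -- {name: [(neighbor, cost), ...]} traversable edges with terrain costs
        Returns the list of node names along the cheapest path, or None.
        """
        def h(n):
            # Straight-line distance to the goal; never overestimates the true cost.
            (x1, y1), (x2, y2) = nodes[n], nodes[goal]
            return math.hypot(x2 - x1, y2 - y1)

        open_set = [(h(start), 0.0, start, [start])]
        best_g = {start: 0.0}
        while open_set:
            _, g, current, path = heapq.heappop(open_set)
            if current == goal:
                return path
            for neighbor, cost in edges.get(current, []):
                g_new = g + cost
                if g_new < best_g.get(neighbor, float("inf")):
                    best_g[neighbor] = g_new
                    heapq.heappush(open_set,
                                   (g_new + h(neighbor), g_new, neighbor, path + [neighbor]))
        return None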

In past projects, obstacle avoidance was done by rejecting edges that passed

through a known obstacle. This simple technique ensures that obstacles will be avoided;

however, it reduces the connectivity of the graph and may unnecessarily prevent access to

certain areas of the map. The path planning for the tractor uses edge-building techniques

that take obstacle shape into account and build edges that circumvent obstacles.


Once the mission has been constructed it is broken down into path segments.

Each path segment gives a start and endpoint of the path and a radius for the path. A

radius of zero indicates a straight path. Then, there are effectively two types of paths:

lines and curves. All the path segments for the entire mission are sent to the master and

sensor node and stored in order of execution.
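
Given that description, a path segment reduces to three fields. The sketch below shows one plausible representation; the field names and example coordinates are hypothetical, not the project's actual message format.

    from dataclasses import dataclass

    @dataclass
    class PathSegment:
        """One mission path segment: start point, end point, and radius."""
        start: tuple    # (x, y) start point of the segment
        end: tuple      # (x, y) end point of the segment
        radius: float   # 0.0 = straight line, > 0.0 = circular arc

        def is_straight(self) -> bool:
            return self.radius == 0.0

    # A mission is simply an ordered list of segments, sent to the master and
    # sensor nodes and executed in order.
    mission = [
        PathSegment(start=(0.0, 0.0), end=(100.0, 0.0), radius=0.0),   # straight swath
        PathSegment(start=(100.0, 0.0), end=(100.0, 6.0), radius=3.0), # end-of-row turn
    ]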

Planning a mission is not the only function of the path planner. The other

function is to provide the sensor node with a global understanding of the environment in

which the tractor operates. This allows the sensor node to react during obstacle

avoidance not only to the objects the detection sensor "sees," but also to objects the sensor

might not see but that are still present. An example of this might be if the tractor is

avoiding an obstacle that is on the desired path of travel with no other obstacles in view

of the detection sensor. From the obstacle detection and avoidance algorithm's

perspective, avoiding to the right or left of the obstacle makes no difference. However, if

the map of the field indicated a pond was to the left of the obstacle then the obstacle

should be avoided by leaving the path to the right. Thus the global understanding provided by the

path planner assists the avoidance algorithm by indicating obstacles other than those seen

by the detection sensor.

2.4 Sensor Node

Having discussed how the path planner and master nodes contribute to obstacle

detection and avoidance we now focus on the node where obstacle detection and

avoidance are performed. In this section we take a closer look at how obstacle detection

and avoidance are performed by examining the architecture of the sensor node shown in


Fig. 2.3. Before the obstacle detection and avoidance portion of the sensor node is

discussed we need to address the position and attitude sensors that provide the tractor

with its sense of position and heading.

In order to execute missions according to the specifications of the path planner the

tractor must know where it is in the world and the direction it is heading in this world.

This is accomplished by a global positioning system (GPS) receiver and a fiber optic

gyroscope (FOG). The GPS unit is an Ag132 from Trimble Navigation, which gives

differential position information that is accurate to within a meter. The FOG is a rate

integrating heading sensor. Integrating the tractor’s rate of turn provides the heading of

the tractor. The heading of the FOG is very accurate but unfortunately needs to have an

external reference at initialization and needs periodic updates during the mission to

[Fig. 2.3: Sensor node architecture. Block diagram: the GPS and FOG feed position and heading decoders over RS-232, the laser range finder feeds the obstacle filter, and the obstacle controller, status broadcaster, and master/mission planner listeners exchange vehicle status, map and known-obstacle information, obstacle reports, and motion exceptions with the master and mission planner nodes over TCP/IP.]


correct for bias drift from the integration error.
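
The idea of integrating the turn rate and periodically correcting the result can be illustrated with a few lines of code. This is only a sketch under assumed interfaces: the update rate, correction gain, and source of the external reference (for example, a GPS-derived course) are not specified here, and the names are hypothetical.

    class HeadingEstimator:
        """Tracks heading by integrating the FOG turn rate; periodic external
        fixes remove the bias drift that accumulates in the integration."""

        def __init__(self, initial_heading_deg):
            # External reference required at initialization, as noted above.
            self.heading_deg = initial_heading_deg

        def update(self, turn_rate_dps, dt_s):
            # Heading is the integral of the measured rate of turn.
            self.heading_deg = (self.heading_deg + turn_rate_dps * dt_s) % 360.0

        def correct(self, reference_heading_deg, gain=0.1):
            # Periodic update: nudge the integrated heading toward the external
            # reference to cancel the accumulated bias drift.
            error = (reference_heading_deg - self.heading_deg + 180.0) % 360.0 - 180.0
            self.heading_deg = (self.heading_deg + gain * error) % 360.0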

Recall the two questions raised in the introduction: (1) is there a sensor that can

perceive the environment around the tractor well enough to provide data for collision-free

driving while accomplishing the predetermined mission as quickly as possible and (2)

how can we interpret these data and make correct decisions about the actions needed to avoid

obstacles? The question of reporting the environment will be addressed in Chapter 3.

After the sensor reports the environment we then must interpret the data as quickly as

possible so that avoidance headings can be computed.

This leads us to the obstacle filter. The obstacle filter is a way of interpreting the

data from the detection sensor and combining that data with the map of known obstacles

from the path planner to give an up-to-date map of the environment surrounding the

vehicle. This up-to-date map is constructed by combining all known obstacles with any

new previously unknown obstacles that the detection sensor saw. The detection sensor

will see many obstacles, but to avoid dealing with the same obstacles twice, the obstacle

filter ignores any obstacle reported by the detection sensor that was previously known by

the path planner. An up-to-date map allows the tractor to make effective collision-free

heading calculations.
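
A stripped-down version of that filtering idea is sketched below. The point-based obstacle representation, the matching tolerance, and the function names are assumptions made for illustration; the actual filter is described in Chapter 5.

    import math

    def filter_detections(detections, known_obstacles, tolerance_m=1.0):
        """Separate new obstacles from ones the path planner already knows about.

        detections      -- [(x, y), ...] obstacle points seen by the range sensor
        known_obstacles -- [(x, y), ...] obstacles already in the planner's map
        Returns (new_obstacles, up_to_date_map).
        """
        def matches_known(point):
            return any(math.hypot(point[0] - kx, point[1] - ky) <= tolerance_m
                       for kx, ky in known_obstacles)

        # Ignore detections that match a known obstacle so the same obstacle
        # is not dealt with twice; report and keep everything else.
        new_obstacles = [p for p in detections if not matches_known(p)]
        up_to_date_map = list(known_obstacles) + new_obstacles
        return new_obstacles, up_to_date_map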

With the obstacle filter providing a map of the surrounding environment, the

obstacle controller can calculate proper heading corrections to avoid striking obstacles.

The algorithm for computing these heading corrections comprises chapter 5. While the

tractor is avoiding obstacles the obstacle controller also ensures that the master node is

steering the vehicle to the proper avoidance heading. It does this by monitoring the


vehicle heading and issuing steering commands to the master node that will bring the

tractor to the calculated avoidance heading.
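
The closed-loop behavior described here amounts to a heading regulator. A minimal proportional version is sketched below; the gain, the steering limit, and the command interface are made-up values for illustration, not the controller actually used on the tractor.

    def steering_command(current_heading_deg, avoidance_heading_deg,
                         gain=0.5, max_steer_deg=35.0):
        """Return a steering angle that turns the tractor toward the
        calculated avoidance heading (illustrative proportional correction)."""
        # Smallest signed difference between the two headings, in (-180, 180].
        error = (avoidance_heading_deg - current_heading_deg + 180.0) % 360.0 - 180.0
        steer = gain * error
        # Saturate at the assumed steering limit before issuing the command.
        return max(-max_steer_deg, min(max_steer_deg, steer))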

In this chapter we have discussed the tractor subsystems and how they all work

together to operate the vehicle and to aid in the detection and avoidance of obstacles. In

the next chapter we describe in more detail one of the subsystems, the obstacle detection

sensor.


Chapter 3

Obstacle Detection Technology

Having discussed the tractor system as a whole, in this chapter we discuss the first

topic of this thesis, the obstacle detection sensor. First we consider why the detection

sensor is so vital to autonomous vehicles, whether they be tractors, airplanes, or robots.

We then examine what other researchers have used for detection sensors and what they

have to say about the various technologies that are available. Finally, we briefly describe

the obstacle detection sensor that was used on the tractor throughout the course of the

project.

For people to truly be taken out of the loop in the field of autonomous vehicles,

all of the functions that man performs must be mimicked. A microprocessor takes the

place of the brain; complex algorithms and programs take the place of decision-making;

actuators take the place of feet to accelerate or brake, and hands to steer. Unfortunately,

the most important feature of a person is the hardest to mimic. The ability of a person to

see and take in so much with his eyes is key. Without eyes the microprocessor crunches

the complex algorithms and programs blindly. The program expects a leisurely cruise in

the country, but the vehicle may actually be travelling on perilous terrain. This is the

biggest limitation for autonomous vehicles, with no single perfect solution. Before

deciding which detection sensor to place on the tractor, we examined many different

options. Section 3.1 discusses the different detection sensors that other researchers are

using in this field, Section 3.2 compares those sensors, and Section 3.3 describes the detection sensor used for the tractor

project.


3.1 Obstacle Detection Sensors

An examination of various research studies on autonomous vehicles/robots shows

that there are only five or six different types of effective obstacle detection sensors.

These sensors range in price from inexpensive to very expensive. Each has its

own unique advantages and disadvantages for different applications. The sensors are not

limited to obstacle detection. Some sensors are used for vehicle localization (the

detection of objects at known locations to determine the vehicle’s location in a map as

explained in [3] and [4]). Other sensors may be used to extract different features in

plants for plant characterization, allowing an autonomous robot to give the proper

fertilizer in the proper amounts to different plants as explained by Harper [5]. Again,

other sensors are used to create maps by mapping passable and impassable terrain as

explained by Howard [6] and by Matsumoto [7]. While none of these applications

addresses our problem of obstacle detection directly, they do give us an idea of how these

different sensors might work in our setup. If a sensor is used effectively to create

accurate maps of the vehicle's environment, then it is possible to detect obstacles in a

farming environment. What follows is a discussion of each of the different sensors

being used today. Their advantages and disadvantages will be highlighted and the

possibility of use on the tractor project will be examined.

3.1.1 CCD Camera

The first type of detection sensor we consider is the CCD camera. The camera is

considered a passive sensor since it requires ambient light to illuminate its

field of view. The camera also happens to be similar in a crude sense to the human eye.


When two cameras are used together stereo vision is accomplished. Stereo vision

gives us what we want most: range-to-target. Stereo vision is used extensively for

obstacle detection in numerous research projects. Chahl [8] used a single CCD camera in

conjunction with a spherically shaped curved reflector that enables ultra-wide-angle

imaging with no moving parts. Bischoff [9] used cameras to navigate a humanoid service

robot indoors. Apostolopoulos [10] used cameras to aid in navigation and obstacle

detection for a robot searching for meteorites on the Antarctic continent. Omichi [11]

used cameras as part of an array of sensors used for vehicle navigation. Chatila [12] uses

stereo vision to aid in dead reckoning for planetary rovers. Soto [13] used 5 CCD

cameras for reconnaissance and surveillance on All-Terrain Vehicles. Almost every

researcher has at least tried stereo vision, but most, unable to overcome its

shortcomings, have either eliminated it from their systems or relegated it to a backup

or redundant sensor, as did Langer [14], Harper [5], and Foessel [15]. Even though

stereo vision is most like the human eye, there are some very big drawbacks to its use.

The biggest is the need for good light. Without good light to illuminate the field of view

obstacles can be badly obscured or lost. This is an important problem for an autonomous

farm tractor. When the farmer’s day often starts before daylight and ends well after dark

the sensor needs to function reliably in any light conditions. This brings out the

difference between stereo vision and human eyesight: the eye's ability to adjust on the fly to

different lighting. Another problem with stereo vision is that the computation costs are

extreme, making it too slow for real-time systems as concluded by Langer [14]. The

other big problem with stereo vision is how to determine what is background and what is


an obstacle. The background is often the same color as the plants Harper [5] wants to

detect, making cameras useless. The human brain has been trained to recognize so many

different patterns and color schemes that we can instantly decipher the data from our eyes

and determine backgrounds and obstacles. The computing power required for this is far

beyond our capabilities. If the computer brain can be trained to recognize background,

then a determination of where obstacles are on that background can be accomplished.

Howard [6] used visual data obtained from cameras to build maps. To do this he assumes

the floor is a constant color, thus making the distinction between passable and impassable

areas of the map. Navigation was accomplished in [7] by placing patterns on the floors.

These patterns are easily deciphered from the camera and the robot then travels along the

pattern. Soto [13] used 5 color cameras for surveillance outdoors. He assumes that all

background is grass and is therefore green. This means anything not green is considered

an intruder. Another big problem with stereo vision is dust, rain, and snow. Foessel [15]

described the problems that cameras have in blowing snow. His research was conducted

in polar environments where there is an abundance of blowing snow. Corke [16]

described the problems associated with cameras and dust typical in mining environments.

To the sensor these conditions appear as obstacles when they are not. Again human eyes

adjust to these problems and filter them out. Indoor environments, where lighting and

coloring can be controlled, seem the best place to use stereo vision. The farming

environment offers too many challenges for stereo vision to be used effectively as an

obstacle detection sensor.


3.1.2 Ultrasonic Sensors

A second type of detection sensor is the ultrasonic sensor, or sonar. Sonar is the

most widely used sensor for obstacle detection because it is cheap and simple to operate.

Its use ranges from the robot hobbyist to the serious researcher. In [3] and [11] sonar

is used in vehicle localization and navigation respectively. In [13], [17], [18], and [20]

sonar is used for obstacle detection. Borenstein [21-28] used a sonar ring around his robot

for obstacle detection, which allowed him to develop an obstacle avoidance algorithm.

Harper [5] states that “sonar works by blanketing a target with ultrasonic energy; the

resultant echo contains information about the geometric structure of the surface, in

particular the relative depth information.” Unfortunately, a key drawback of sonar is one

sensor for one distance reading; that is, in order to obtain an adequate picture of the

environment around the vehicle many sensors must be used together. It is not uncommon

to have several rings of sonar sensors on omni-directional vehicles because the vehicle

can travel in any direction at any time, as is the case with Borenstein [22], [23], [26]. The

number of sensors required for an adequate field of view is not the only drawback to

sonar. Borenstein [28] explains that “Even though ultrasonic ranging devices play a

substantial role in many robotics applications, only a few researchers seem to pay

attention to (or care to mention) their limitations.” He goes on to explain that for

ultrasonic sensors to be effective they must be as perpendicular to the target as possible to

receive correct range data. This is because the reflected sound energy will not be

deflected towards the sensors if the two are not perpendicular to each other. Borenstein


[25] goes on to list three reasons for ultrasonic sensors being poor sensors when

accuracy is required. These reasons are:

1 – Poor directionality that limits the accuracy in determination of the spatial position of an edge to 10-50 cm, depending on the distance to the obstacle and the angle between the obstacle surface and the acoustic beam.

2 – Frequent misreadings that are caused by either ultrasonic noise from external sources or stray reflections from neighboring sensors ("crosstalk"). Misreadings cannot always be filtered out and they cause the algorithm to "see" nonexistent edges.

3 – Specular reflections that occur when the angles between the wave front and the normal to a smooth surface is too large. In this case the surface reflects the incoming ultra-sound waves away from the sensor, and the obstacle is either not detected at all, or (since only part of the surface is detected) "seen" much smaller than it is in reality.

Perhaps the biggest limitation of sonar for the tractor application is the limited range of

sight. Given a one-ton tractor traveling at 5-8 mph, by the time the sonar detected an obstacle

it would already be too late to begin safe avoidance. However, despite all the limitations

of ultrasonic sensors, the technique can still have a good use. Sonar is a good safety net

sensor. If the other detection sensors miss an obstacle, or for some reason the vehicle

gets close enough for the sonar to be triggered, then a halt state can be executed and the

vehicle stops immediately.

3.1.3 Scanning Laser

A third type of detection sensor is a scanning laser. Scanning lasers use a laser

beam reflected off a rotating mirror. The beam reflects off the mirror and out to a target

and then is returned to the sensor for range calculations. Two main types of scanning

lasers are used. The first emits a continuous beam and from the return of that beam range


data are calculated. A laser of this type, provided by Acuity Research Inc., was used

by Torrie in a different CSOIS project [20]. The Acuity Research laser is a class 3 laser

and is not recommended because it is not eye safe. The second type is a pulsed laser that

sends out many laser pulses and averages the range data on each pulse to determine the

range to an object. This type of laser is considered a class 1 and is better because it is eye

safe. Another advantage to the pulsed laser is that measurement errors can be filtered out

better than continuous beam lasers. Scanning lasers, as they are called, are starting to

make their presence known in autonomous vehicle research. Bailey [4] uses laser

scanners to determine the position of his robots. He knows the exact location of all

obstacles and by using the laser scanner can accurately calculate the robot's position based

on distance to the obstacles and angle with respect to the obstacles. Apostolopoulos [10]

uses scanning lasers along with cameras to find meteorites in the Antarctic. In [11], [16],

and [20] scanning lasers are used to detect obstacles, which in turn aids in safe

navigation. Scanning lasers give better range data with far lower computational

cost than camera-based detection, and their resolution is considerably better than that of

ultrasonic sensors. The angular resolution on a Sick Optic Inc Laser Measurement

System (LMS) laser range finder can be as small as 0.25° with a field window of 180°.

Much of the research that places autonomous vehicles outdoors has moved away from

stereo vision to scanning lasers [16, 20] because no matter the amount of light, range data

are available. The scanning laser does have its drawbacks. Most notable is that the scans

are planar, which means that if an obstacle is above or below the scanning plane then

nothing is detected. The sensors also suffer from dust, rain, and blowing snow, causing


false readings as was noted by Foessel [15], who operated his robots in polar

environments.
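
To show what a planar scan provides (and why anything above or below the scan plane is invisible), the sketch below converts one 180° sweep of range readings into (x, y) points in the sensor frame. The resolution, maximum range, and packet handling are simplified assumptions; the actual LMS data format used on the tractor is covered in Chapter 5.

    import math

    def scan_to_points(ranges_m, resolution_deg=0.5, max_range_m=15.0):
        """Convert one planar laser scan into (x, y) points in the sensor frame.

        ranges_m -- range readings sweeping from 0 deg (right) to 180 deg (left)
        Readings of zero or beyond max_range_m are treated as no return.
        """
        points = []
        for i, r in enumerate(ranges_m):
            if r <= 0.0 or r > max_range_m:
                continue                  # no obstacle along this bearing
            angle = math.radians(i * resolution_deg)
            points.append((r * math.cos(angle), r * math.sin(angle)))
        return points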

3.1.4 3D Scanning Lasers

3D scanning lasers make up a fourth type of detection sensor. The reason they are

classified differently than 2D scanning lasers is the jump in price and complexity.

Langer [14] describes a 3D-scanning laser his company has developed and how it might

be used. On the surface a 3D scanning laser looks attractive, but for real-time systems it

is infeasible. Langer states that the time required to scan 8000 pixels is 80 seconds. The

other drawback is the price of a unit. The author priced out another 3D scanning unit

with the same specifications as [14] and it was over $150,000.00, too much to put on any

farm vehicle even if the scanning speed was real-time.

3.1.5 Millimeter Wave Radar

The final type of detection sensor we consider is millimeter wave radar. This

sensor also suffers from a high price tag and not many researchers have the means to use

it. One place it has been used is at Carnegie Mellon University, on their Antarctic rover.

Foessel [15] explains why the harsh polar environment renders stereo vision and scanning

lasers useless: the blowing snow causes too many false objects. When millimeter wave

radar was tested in this environment blowing snow had “almost no effect on millimeter-

wave range estimation.” The narrow beam associated with radar also allows for good

angular resolution. When properly gimbaled the radar can collect a 3D image of the

environment in the path of the autonomous vehicle.


3.2 Comparison of Sensors

We have examined five different types of obstacle detection sensors and have

discussed some of their advantages and disadvantages separately. Now we will compare

each sensor side-by-side for our application. Given the farming environment we can set

five criteria for selecting the best obstacle detection sensor: operation in any weather,

operation in any light, detection out to at least 15 meters (50 feet), fast response time, and

cost that is significantly less than that of the tractor. Ideally the best sensor will meet our

five criteria, but if this is not possible the one that comes closest will be selected.

3.2.1 Weather

The farming environment brings out the whole spectrum where weather is

concerned. The sensor must be able to handle dust, rain, and snow, all of which can be

blowing and swirling at any given time. Because of this the camera, sonar, and scanning

laser can give false readings. The only suitable detection sensor is the millimeter wave

radar.

3.2.2 Light

On the farm the day starts in the darkness of the morning and then gradually the

light gets more intense as the day goes on until the noon day peak. At this point the light

recedes much as it started as the day winds down. If, however, the weather changes

throughout the day the amount of light may also change. This means that our detection

sensor must operate in any light. From the start this excludes the CCD cameras. They

function on ambient light and any changes in the lighting can affect the interpretation of


their data. Sonar, scanning laser, and millimeter wave radar are relatively unaffected by changing light conditions and would therefore be good choices.

3.2.3 Detection Distance

For a tractor that travels at 8.8 ft/sec it only takes 3.4 seconds for the tractor to

travel 30 feet. The obstacle detection sensor must determine the presence of an obstacle

and the avoidance algorithm must calculate an avoidance heading in under 3.4 seconds,

preferably well under 3.4 seconds. This means that the detection sensor must be able to

detect obstacles soon enough that avoidance is performed safely. With this limitation we

determined that the obstacle detection sensor used on the tractor must have a maximum

detectable range of at least 15 meters (50 feet). This gives plenty of time for obstacle

identification and avoidance heading calculations. This requirement excludes most

ultrasonic sensors, which have a maximum range of about 5 meters. CCD cameras,

scanning laser, 3D scanning laser, and millimeter wave radar all have maximum

detectable ranges of 15 meters or more.

3.2.4 Response Time

In order for a detection sensor to be adequate, the time required to see an obstacle

and cause the tractor to respond is critical. In the case of the 3D-scanning laser the time

required to scan one image in front of the vehicle is 80 seconds, which is obviously

unacceptable for a real-time system. Sonar can determine fast enough if an obstacle is

present, but sonar can only see a short distance. That is, the tractor will already be too


close to an obstacle before the sonar can sense an obstacle. A CCD camera’s response

time depends on the image processing speeds and capabilities. With faster computers this

is not a problem and a camera's response time is fast enough to detect obstacles at a safe

distance. 2D-scanning lasers have a very fast response time, on the order of 17 ms to 1 s

depending on the resolution and the speed that data are being output. This allows

scanning lasers to detect obstacles at a safe distance, allowing for proper obstacle

detection and avoidance. Millimeter wave radar also has a fast enough response time.

As with cameras, the higher speed computers allow for fast computation of the radar

signal. Radar can detect obstacles at safe distances.

3.2.5 Cost

For most research projects cost of sensors may not be a prime issue, but with the

autonomous tractor tested in this project there is a desire to keep cost down because of

possible future production considerations. If the cheapest detection sensor were the only

issue then hands down the sonar or CCD cameras would be the choice. The 2D scanning

laser costs significantly more than the sonar and the CCD camera, but it would not break

the bank to put it on a production type tractor. In fact, looking at our requirement of

costing significantly less than the cost of the tractor the scanning laser is acceptable. The

3D scanning laser, on the other hand, costs as much if not more than most tractors and

would not be feasible. The millimeter wave radar also has a high cost; while not as high

as that of the 3D-scanning laser, it is still significant enough to warrant the choice of a cheaper

technology.


3.2.6 Summary

After examining the five criteria we must settle on the optimum sensor. Because

radar is the only sensor that works well in all weather conditions it would be a natural

choice. However, due to the high cost and the relative newness of the technology we

opted not to use radar, hoping that within a few years the cost and technology would be

more practical. Based on the remaining four criteria the 2D scanning laser was the most

practical choice. The amount of ambient light has little effect on the sensor, the

maximum detection range is 30 meters, the response time is adequate, and the cost for a

unit is justifiable. The comparisons of each sensor with the five criteria are shown

graphically in Fig. 3.1.

In this section we have discussed five different sensors used for obstacle detection. We

investigated research that uses each of the five different sensors and we listed their

advantages and disadvantages. Finally we set forth five criteria that the detection sensor needed to meet in order to be used on the tractor. With each sensor evaluated against these five criteria we decided that a 2D laser scanner was the optimum sensor at the present time for the autonomous tractor. The next section discusses in detail the laser scanner that was chosen and used on the project.

Fig. 3.1: Comparison of obstacle detection sensors. An "x" indicates that the sensor meets the criterion.

                        Operation    Operation    Detection of    Fast        Cost less
                        in any       in any       at least        response    than
                        weather      light        15 meters       time        tractor
CCD Cameras                                       x               x           x
Ultrasonic                           x                            x           x
Scanning Laser                       x            x               x           x
3D Scanning Laser                    x            x
Millimeter Wave Radar   x            x            x               x

3.3 Detection Sensor Used on Tractor

To detect obstacles on the tractor project, an LMS 220 laser range finder was

mounted on the front of the tractor (see Fig. 3.2). This sensor, from Sick Optic Inc.,

scans obstacles out to 30 meters in a 180° planar scan window (see Appendix A for

detailed specifications). The laser measures distance in meters with a range resolution of

0.01 m, from 0° to 180°, in increments of 0.5°. The angle resolution is configurable with

possible resolutions being measurements every 0.25°, 0.5°, or 1.0°. With a resolution of

0.5° there are 361 distance measurements that the laser reports during continuous output

mode. Each measurement requires 2 bytes of information, with the low byte being

reported first and the high byte being reported last. This makes a total of 722 data bytes

that are reported by the laser to indicate the distance measured every 0.5°. In addition to

the 722 bytes of distance data there are also 10 additional bytes of data, 7 before the

distance data and 3 after the distance data, making the total number of bytes in one report

packet 732. In Section 5.1 we explain how this data is used for avoidance.
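As a concrete illustration of this byte layout, the following sketch (Python, written for this discussion rather than taken from the tractor software) decodes the 722 distance bytes of one continuous-mode report into 361 range values; treating each two-byte value as a range in centimeters is an assumption based on the stated 0.01 m resolution:

    # Sketch only: decode one 732-byte LMS continuous report into 361 ranges.
    # Assumes the 7 header bytes and 3 trailer bytes described above, low byte
    # reported first, and each 16-bit value a range in centimeters (0.01 m).
    def decode_lms_packet(packet):
        assert len(packet) == 732
        data = packet[7:7 + 722]              # 361 measurements, 2 bytes each
        ranges_m = []
        for i in range(0, len(data), 2):
            low, high = data[i], data[i + 1]  # low byte comes first
            ranges_m.append(((high << 8) | low) / 100.0)
        return ranges_m                       # index k is the range at k * 0.5 degrees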

Fig. 3.2: Sick Inc. laser range finder (obstacle sensor) mounted on the front of the tractor.

In Section 3.2 we selected a sensor based on our five criteria. The only criterion

that the scanning laser did not meet was the problem of false readings due to inclement

weather. In fact we saw that after two months of driving in a dirt field the amount of dust

created by the tractor tires while driving caused many false readings from the laser. To

remedy this problem, all range data closer than seven feet were ignored. This was not a

desirable solution, but without another detection sensor to back up the laser range finder

it was the only practical solution.

In this chapter we have discussed obstacle detection and how it relates to the

project. We first indicated why an obstacle detection sensor was so important to

autonomous vehicle research. Next we looked at the detection sensors being used by

researchers in the field of autonomous vehicles and described the basis for the selection

of a sensor for the autonomous tractor. Finally we briefly described the obstacle

detection sensor that was used on the tractor. Now we can focus our attention on how the


vehicle will react to measurements of the surrounding environment that are obtained by

the detection sensor.


Chapter 4

Obstacle Avoidance

In this chapter we present two different approaches to obstacle avoidance: global and

local avoidance. We then examine the desired attributes of the path planner/obstacle

avoidance algorithm used for the autonomous tractor project. We look at several local

obstacle avoidance techniques described in research literature, one of which is selected as

the model for the avoidance algorithm ultimately used on the tractor.

4.1 Global and Local Obstacle Avoidance

In autonomous vehicle research two levels of planning occur. One level of

planning, called global planning, examines the whole world an autonomous vehicle can

travel in and plans paths from one point to the next dependent upon this world. Obstacle

avoidance on a global level is accomplished by routing all paths away from potential

obstacles. If the vehicle encounters an unexpected obstacle during a mission then the

global planner reexamines the whole map with the added obstacle and adjusts either a

portion of the affected path or the rest of the path based on the newer map data.

Local or reactive planning plans a short path for the vehicle to traverse based only

on the environment surrounding the vehicle, typically using only the detection sensor

output. Local or reactive avoidance is aware of the desired path of travel, but due to path

obstructions, plots a path around the obstruction until the vehicle can safely reattach itself

to the desired path of travel.


An example of global and local obstacle avoidance can be illustrated by my drive

to work. Each morning that I drive to work, my route is the same. If one day I am driving to work

and a child jumps in front of my car, I do not drive a different route to work; instead, I

swerve around the child and continue on my way in the usual fashion. This is analogous

to reactive obstacle avoidance. However, one day while driving to work I come across a

roadblock. Now I choose a different road, or set of roads, that will get me to work. This

is analogous to global obstacle avoidance.

4.2 Vision of Obstacle Avoidance

At the start of the project the project team examined different approaches to

obstacle avoidance. We knew we wanted the knowledge of known obstacles in the

global map, but we wanted avoidance to be performed by a reactive algorithm. To

accomplish this we decided that the path planner would contain a list of positions for all

known obstacles in the map. These obstacles would determine the paths to be planned;

that is, no known obstacle would block a planned path. If an unknown obstacle were

encountered during the mission then a reactive algorithm would use the detection sensor

data as well as the list of known obstacles to decide the best route around the obstacle or

obstacles and back onto the desired path of travel.

Once the relationship between the path planner and the avoidance algorithm was

clarified we decided on five criteria that the reactive obstacle avoidance algorithm needed

in order to be married into our design architecture:

1 – The algorithm must compute vehicle-heading changes based on a map of the obstacles in the tractor's environment. This map would represent all obstacles seen by the detection sensor as well as known obstacles that may not be seen by the detection sensor.

2 – The algorithm must consider the vehicle to have length and width, not simply be a point source.

3 – To avoid excessive computations the algorithm cannot examine the entire map where the tractor may travel, but must examine only the portion of the map in the vicinity of the tractor.

4 – The algorithm must halt the tractor if an obstacle is too close to the vehicle or the algorithm cannot compute a safe direction of travel around the obstacle.

5 – The algorithm must handle dynamic, or moving, obstacles.

This list served as a guide when evaluating the many different obstacle avoidance

algorithms that we found in the literature. If an avoidance algorithm were lacking in

any of the five requirements then it would not be considered for use on the tractor. The

next section compares several different avoidance algorithms against the five

requirements and then describes in detail the algorithm chosen as the model for the

algorithm that was eventually used on the tractor.

4.3 Obstacle Avoidance Used in Research

In the previous section we outlined five criteria that our obstacle avoidance

algorithm needed to have to be compatible with our design architecture. We did not want

to completely reinvent the wheel so we did an extensive literature search on different

obstacle avoidance algorithms. This section will examine several of the predominant

algorithms being used today. Several of the references that will be discussed do not

outline a formal algorithm, but by reading the material a general understanding of the

algorithm is achieved. These papers usually refer to other works that explain in greater


detail the algorithm being used. The remainder of the section is organized as follows.

Subsection 4.3.1 examines wall-following techniques of avoidance. Subsection 4.3.2

discusses “black hole” techniques. Subsection 4.3.3 presents the idea of path-computing

techniques. Subsection 4.3.4 considers potential field techniques. Finally Subsection

4.3.5 describes histogram techniques.

4.3.1 Wall-Following

Wall-following avoidance is a technique of following the contours of an obstacle until the

obstacle no longer blocks the desired path to the goal. Kamon [29] describes an

algorithm used to navigate a robot to a goal location in a real world environment. The

algorithm uses two basic modes of motion: motion towards the target and obstacle-

boundary following [29]. The robot proceeds to the goal until an obstacle is

encountered. Once the obstacle is reached it then moves along the obstacle boundary

until it determines that it is close enough to the goal to break away from the obstacle.

The robot then proceeds towards the goal until another obstacle is encountered. Fig. 4.1

gives an illustration of what this algorithm might do, and is similar to pictures found in

[29]. Kamon explains that in their experiments the algorithm worked quite well in most

situations, producing minimal path distances around obstacles compared with other wall-

following algorithms.
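A minimal sketch of the two-mode behavior just described, with motion toward the goal and boundary following as the only states; the state names, inputs, and returned actions are illustrative placeholders, not Kamon's actual rules:

    # Sketch of a two-mode wall-following controller (goal seeking / boundary
    # following). The inputs are booleans supplied by a real sensor interface.
    def wall_following_step(state, obstacle_ahead, clear_toward_goal):
        if state == "GO_TO_GOAL":
            if obstacle_ahead:
                return "FOLLOW_BOUNDARY", "turn_along_boundary"
            return "GO_TO_GOAL", "steer_to_goal"
        else:  # FOLLOW_BOUNDARY: hug the obstacle until the goal is reachable
            if clear_toward_goal:
                return "GO_TO_GOAL", "steer_to_goal"
            return "FOLLOW_BOUNDARY", "keep_boundary_on_side"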

Fig. 4.1: Example of the wall-following technique as presented in [29].

Even though wall-following is an effective means of obstacle avoidance, when

graded against our five criteria it is obviously not the technique we want to use. This

algorithm does not compute vehicle heading based on a map of the obstacle environment,

as the first criterion states; rather, heading is determined as the detection sensor senses the

contour of the obstacle. This effectively eliminates known obstacles that may not be

sensed by the detection sensor. This is a problem if the vehicle senses more than one

obstacle. The algorithm assumes the vehicle is a point source, which does not follow our

second criterion, although this could be changed in the algorithm such that non-point

source vehicles could be used. The third criterion does not factor into this evaluation

because the algorithm uses no map for obstacle avoidance. This is impossible to change

because the algorithm is truly a reactive algorithm: it only reacts to what the detection

sensor sees and nothing else. The fourth criterion, stating that the vehicle must halt if

obstacles are too close or if the obstacle is impassible, is not realized in this algorithm.

The very nature of the algorithm dictates that the vehicle positions itself very close to the

obstacle in order to follow the contours properly. The algorithm could be made to stop


however, if the obstacle is impassable, by forcing the vehicle to stop once it has traveled

too far off the desired path. The final criterion, handling dynamic obstacles, is also not

realizable. Kamon states “the algorithm navigates a point robot in a planar unknown

environment populated by stationary obstacles with arbitrary shape.”

4.3.2 Black Hole

The black hole technique of obstacle avoidance is a way of examining all of the

obstacles in front of the vehicle and heading towards the largest opening, or hole.

Bischoff [9] discusses how the black hole technique can be applied to humanoid service

robots. He explains that “wandering around is a basic behavior that can be used by a

mobile robot to navigate, explore and map unknown working environments.” When the

robot wanders it has no knowledge of its surroundings and must perform obstacle

avoidance. Bischoff explains that the onboard sensors segment the images around the

robot into obstacle-free and occupied areas and

“After having segmented the image, the robot can pan its camera head towards those obstacle-free regions that appear to be the largest. If such a region is classified as large enough, the steering angle of both wheels is set to half of the camera's head pan angle and the robot moves forward while continuously detecting, tracking and fixating this area.”

In this wandering state the robot travels and avoids obstacles until a user stops the state.

If the robot becomes stuck in a corner or dead-end hallway then it backtracks until it can

wander freely again.

Chahl [8] describes a similar black hole avoidance algorithm. The algorithm is

different than Bischoff’s but the general principle of finding the most open tunnel in front


of the autonomous vehicle is the same. In the case of Chahl an autonomous airplane is

used. Here the vehicle can sense obstacles in three dimensions and has three constraints.

The constraints are:

1 – continual progress towards a goal
2 – avoiding collisions
3 – maintaining a low altitude

With these constraints the goal is to find a free path in the panoramic range image

provided by the on-board sensors. A free path is described as “at least a tunnel of given

minimum diameter. The diameter of the tunnel is determined by the safe minimum

distances between the vehicle and the nearest obstacle. These parameters would be

determined by the wingspan and controllability of the aircraft.” This algorithm looks for

black holes, but unlike [9] the black hole that gets the airplane closest to the target is

taken.

Black hole obstacle avoidance is an effective means of using obstacle data in front

of the vehicle to determine the best course to take. When compared with our five criteria

for possible use on the tractor all but one of the criteria are acceptable. The black hole

approach does not meet the first criteria. The safest black hole to travel is calculated with

only the current detection sensor readings and therefore known obstacles not seen by the

detection sensor would go unnoticed. Even in [9] where the environment is mapped and

obstacle-free regions are established the robot must still examine each obstacle free zone

with its sensors to determine passability. In [9] there is a possibility of placing known

obstacles into the map before obstacle-free regions are established. The second criterion

however is met because for any black hole to be passable it must be at least the


dimension of the vehicle. The third criterion is also met because computations for

avoidance heading are made only on the obstacles within the immediate environment of

the vehicle. The fourth criterion of stopping the vehicle if obstacles are too close or if

there is no possible path of travel is also achievable in this type of avoidance. Finally, the

ability to handle dynamic obstacles is achievable because black holes are determined for

each sensor scan of the surrounding environments. If an obstacle moves then the next

sensor scan will indicate different black hole locations. An algorithm that is similar to

this approach is discussed in Section 4.3.5, except that black holes are found after obstacles are

placed into a map of the environment.
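The gap-seeking idea shared by [8] and [9] can be sketched as a search for the widest run of range readings beyond a clearance distance, steering toward the center of that run; the scan format and the clearance parameter here are assumptions for illustration:

    # Sketch: find the widest free angular gap ("black hole") in a planar scan.
    # ranges: list of distances, one per angular step; a reading is free if it
    # lies beyond the clearance distance.
    def widest_gap_heading(ranges, step_deg, clearance):
        best = (0, None)                                  # (width in steps, start index)
        start = None
        for i, r in enumerate(list(ranges) + [0.0]):      # sentinel closes the last run
            if r > clearance and start is None:
                start = i
            elif r <= clearance and start is not None:
                if i - start > best[0]:
                    best = (i - start, start)
                start = None
        if best[1] is None:
            return None                                   # no passable gap: stop the vehicle
        return (best[1] + best[0] / 2.0) * step_deg       # heading of the gap's center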

4.3.3 Path Computing

Path computing techniques limit the possible directions of travel by the robot to

a certain number of headings. Usually three directions of travel are possible: right, left,

and straight-ahead. Every time the vehicle travels towards the goal these three directions

are determined to be free or blocked. Based upon the free or blocked nature of the paths

the vehicle navigates toward the goal. Fujimori [30] describes an adaptive obstacle

avoidance algorithm that assumes the following about the robot:

1 – It moves only in the forward direction
2 – It turns to the right or left with a minimum rotation radius rmin
3 – Its velocity is constant except near the goal

Three distance sensors are used to determine if each of the three directions of travel is

free or blocked and the vehicle navigates safely to the goal.

Schwartz in [31] describes another variation on this technique when he describes

an avoidance algorithm used on a medium sized tracked vehicle called PRIMUS. To


accomplish their avoidance, all obstacles in a 60 m x 60 m area are placed in an

obstacle map. The three possible directions of travel are then placed on the map and each

direction is determined to be free or blocked.

The two different examples of path computing obstacle avoidance meet the

requirements discussed at the beginning of this section differently. In [30] vehicle

heading changes are based upon the adaptive algorithm that uses range data from three

sensors. In [31] the three possible directions of travel are determined to be free or

blocked based on overlaying the paths in a map of obstacles surrounding the vehicle.

Thus [31] meets the criterion while [30] does not. The second criterion is also different

for the two methods. In [30] the free or blocked nature of the path is determined based

on the range data from three different sensors. If the sensors were ultrasonic then the

cone of detection might be the width of the vehicle and anything within that cone will be

detected and the vehicle is safe. If a laser range finder were used then this would not be

the case and one of the three directions might appear open when in actuality an obstacle

is close by and choosing that direction to drive the vehicle would result in a collision. In

[31] the possible paths to travel are overlaid on the obstacle map and here the paths must

be the size of the vehicle. In [30] no map is kept of the obstacles; therefore, the third

criterion is not applicable. In [31] however the third criterion is met because a smaller

obstacle window (smaller than the entire map the vehicle can travel in) is used to

calculate the avoidance heading. The fourth criterion is met in both [30] and [31]

because if all three of the directions of travel are blocked then the vehicle can be halted.

The fifth criterion is also realizable because each time a new avoidance heading is


calculated the most recent obstacle map [31] or the most recent sensor reading [30] is

used.
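The free-or-blocked test of the three candidate directions can be sketched as follows; the obstacle-map representation (a dictionary of occupied cells), the 45° offsets, and the three-point swath check are illustrative simplifications rather than the method of [30] or [31]:

    # Sketch: classify three fixed directions of travel as free or blocked by
    # testing cells of an obstacle map along a vehicle-width swath.
    import math

    def free_directions(obstacle_map, pose, swath_len, half_width, cell):
        x, y, heading = pose
        results = {}
        for name, offset in (("left", math.radians(45)),
                             ("straight", 0.0),
                             ("right", -math.radians(45))):
            h = heading + offset
            blocked = False
            d = cell
            while d <= swath_len and not blocked:
                cx, cy = x + d * math.cos(h), y + d * math.sin(h)
                for side in (-half_width, 0.0, half_width):   # sample across the swath
                    px = cx - side * math.sin(h)
                    py = cy + side * math.cos(h)
                    if obstacle_map.get((int(px // cell), int(py // cell)), 0):
                        blocked = True
                        break
                d += cell
            results[name] = not blocked
        return results    # e.g. {"left": True, "straight": False, "right": True}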

4.3.4 Potential Fields

The potential field method of obstacle avoidance is used extensively in robotic

research. This method examines repulsion vectors around obstacles and determines the

desired heading and velocity of the autonomous robot needed to get around the obstacle.

Chuang [32] describes this method as “collision avoidance in path planning is achieved

using repulsion between object and obstacle resulting from the Newtonian potential-

based model.” Several examples of potential based obstacle avoidance follow.

Prassler [17] describes MAid, a robotic wheelchair that roams in a railway station. This

method first places all obstacles into a time stamp map. This map is similar to occupancy

grids (discussed in subsection 4.3.5) except that the time an obstacle resided in a

particular cell in the grid is recorded. With this information obstacles can be tracked and

an obstacle vector can be predicted. With this obstacle vector and the heading vector of

the robot a collision cone can be constructed. The collision cone represents the heading

and velocity vectors that would cause collision with the obstacle. The avoidance

maneuver consists of a one-step change in velocity to avoid a future collision within a

given time horizon. The new velocity must be achievable by the moving robot [17]. This

technique was performed very successfully in a moderately crowded railway station in

Germany.

Another robot that is similar to the one described above was designed by Carnegie

Mellon University and used as an automated tour guide in the Smithsonian Museum [18].


As stated by Thrun, this algorithm “generates collision-free motion that maximizes two

criteria: progress towards the goal, and the robot’s velocity. As a result the robot can ship

around smaller obstacles (e.g. a person) without decelerating.”

Chuang [32] and Soto [13] describe two different algorithms for computing the

potential fields in the environment around autonomous vehicles. Each of these

algorithms has been simulated and tested on robots in real-world environments with

good success.

The four papers described above all base their algorithms on the potential field

method. Even though all four algorithms are different the results are the same and this

allows us to compare the potential field method against our five criteria. The first

criterion is satisfied because the avoidance heading of the vehicle is calculated based on

obstacles placed in a map. The second criterion is also met because the potential fields

are computed based on the vehicle size. The computations for the avoidance heading are

made based on the portion of the map in the immediate environment of the vehicle, thus

satisfying the third criterion. The fourth criterion is met when the vehicle is stopped if

the potential fields indicate no collision-free path is possible. Finally the fifth criterion is

met because the algorithm can handle dynamic obstacles, as is the case for the tour guide

robot and the robot wandering in the railway station.
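A bare-bones sketch of the attraction/repulsion idea underlying these methods: the goal contributes an attractive vector, each obstacle contributes a repulsive vector that falls off with distance, and the vehicle steers along the resultant. The gains and the inverse-square falloff are illustrative choices, not the formulations used in [13, 17, 18, 32]:

    # Sketch: potential-field steering. Obstacles repel (inverse-square falloff),
    # the goal attracts; the commanded heading is the direction of the net vector.
    import math

    def potential_field_heading(position, goal, obstacles, k_att=1.0, k_rep=5.0):
        px, py = position
        fx = k_att * (goal[0] - px)
        fy = k_att * (goal[1] - py)
        for ox, oy in obstacles:
            dx, dy = px - ox, py - oy
            d = math.hypot(dx, dy)
            if d > 1e-6:
                fx += k_rep * dx / d ** 3      # unit vector scaled by k_rep / d**2
                fy += k_rep * dy / d ** 3
        return math.atan2(fy, fx)              # heading of the resultant force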

4.3.5 Histogram

Having discussed each of the different obstacle avoidance techniques we see that

there are desirable aspects of each approach. The histogram approach to obstacle

avoidance contains many of the different aspects in one algorithm. Johann Borenstein


[22-27] developed an obstacle avoidance algorithm based on histograms. Each of

these papers explains the progression of an algorithm that he finally calls the vector field

histogram plus (VFH+) algorithm in [22]. This VFH+ algorithm served as the model for

the algorithm that was developed for the tractor, as explained in Chapter 5. Here we give

a brief explanation of the algorithm found in [22]. For a more in depth understanding of

the algorithm the reader is encouraged to read [22-27].

The VFH+ algorithm consists of four stages, with each stage reducing the amount

of obstacle data to the point of calculating an avoidance-heading vector. The four stages

are the primary polar histogram, the binary polar histogram, the masked polar histogram,

and the selection of the steering direction.

Before any data reduction takes place a preliminary understanding of the algorithm is

needed. The world where the vehicle travels is resolved into a map of N x M cells. The

coordinates of this map are always fixed and the vehicle travels with respect to the map.

This map is called the certainty grid and values associated with each cell represent the

probability of obstacles residing at the cell locations in the grid. In Borenstein’s

algorithm [22-27] when the vehicle mission is first started each cell in the certainty grid

is initialized to zero, indicating a zero probability that an obstacle resides in any particular

cell. In our implementation, described in Chapter 5, all cells in the certainty grid

are initialized to zero except cells that contain known obstacles. Cells that contain known

obstacles are set to the highest possible certainty value, indicating that an obstacle resides

at that cell location (because of the prior knowledge of the path planner). As the vehicle

progresses in its mission the obstacle detection sensor detects obstacles at various grid


locations. Each time an obstacle is sensed the value of the corresponding grid location

is updated by one. Thus the higher the certainty cell number the higher the probability

that an obstacle resides at that cell location. As per the third criterion, avoiding excess

calculations by using only a portion of the map in the immediate vicinity of the vehicle,

the VFH+ algorithm is compliant. Another map, called the active window, overlays that

portion of the certainty grid in the immediate vicinity of the vehicle. It is from all the

cells of the certainty grid overlaid by the active window that the avoidance heading is

calculated. Fig. 4.2 illustrates the certainty grid with an active window and several

known and unknown obstacles that have been placed into the grid.

Fig. 4.2: Example of a certainty grid. Known obstacles occupy cells at the maximum certainty value (50), unknown obstacles detected by the laser occupy cells with intermediate values, empty cells hold 0, and the active window (the laser scan area) overlays the region around the vehicle.

The avoidance heading is calculated by reducing the obstacle data until the masked polar

histogram is calculated. The masked polar histogram indicates the driving directions

that are open and those that are blocked. The driving direction closest to the goal will

be selected and the vehicle will steer in that direction. This is illustrated in Fig. 4.3.

From the figure we see the vehicle is trying to track to the desired path of travel, but an

obstacle resides to its left. The masked polar histogram indicates that the vehicle cannot

choose a steering direction between 180° and 270°, but since the path direction is less

than 180°, the vehicle chooses the steering direction that will return it to the path.

With this preliminary understanding of the VFH+ algorithm we can discuss the four data

reduction stages.

Fig. 4.3: Vehicle tracking a point on the desired path (the avoidance path steers around the obstacle toward the point being tracked).


4.3.5.1 The Primary Polar Histogram

As stated by Borenstein, “The first data reduction stage maps the active region of

certainty grid onto the primary polar histogram.” The active window (Aij) is a circle of

diameter Ws with the vehicle always at its center. The content of each cell in the active

window contains a vector direction and magnitude from the cell Aij to the vehicle center

point (VCP). The vector is given by:

    B_{ij} = \tan^{-1}\left( \frac{y_j - y_0}{x_i - x_0} \right)

where:

    x_0, y_0: present VCP coordinates
    x_i, y_j: active cell, A_{ij}, coordinates

The magnitude is given by:

    m_{ij} = c_{ij}^2 \left( a - d_{ij}^2 \right)

where:

    c_{ij}: certainty value of active cell A_{ij}
    d_{ij}: distance from active cell A_{ij} to the VCP
    a: a constant based on the diameter of the active circle

The parameter a is chosen according to:

    a - \left( \frac{W_s - 1}{2} \right)^2 = 1

For the vehicle to navigate safely around obstacles a certain distance from the obstacles

must be maintained. Enlarging each cell in the active window by the vehicle radius Rr

and some safe distance Ds (rr+s = Rr + Ds) creates a cushion between the vehicle and


obstacles. Ds is the minimum allowable distance between the vehicle and the obstacle.

Each cell has an enlargement angle vij given by:

    v_{ij} = \sin^{-1}\left( \frac{r_{r+s}}{d_{ij}} \right)

The primary polar histogram H^p is created with an angular resolution α chosen so that n = 360°/α

is an integer. In Borenstein's implementation, α is specified to be 5°, making n = 72;

each discrete angular sector k then corresponds to the angle k·α. For each sector k, the polar histogram is

created by:

    H^p_k = \sum_{i,j \in A_{ij}} m_{ij} \cdot h'_{ij}

with:

    h'_{ij} = 1   if k·α ∈ [B_{ij} - v_{ij}, B_{ij} + v_{ij}]
    h'_{ij} = 0   otherwise

It is important to note that the values B_{ij}, a - d_{ij}^2, and v_{ij} are properties of the active

window and never change; they can be calculated once and stored in a matrix of size Ws

x Ws for quick reference.
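Restating the formulas above as code, this sketch builds the primary polar histogram from the cells of the active window, using the certainty values, distances, directions, and enlargement angles as defined above; the clamping of negative magnitudes and the wrap-around handling are implementation choices for the sketch:

    # Sketch only: build the primary polar histogram H^p from the active window.
    import math

    def primary_polar_histogram(cells, ws, r_enlarged, alpha_deg=5):
        # cells: iterable of (x, y, certainty) with the VCP at the origin
        n = 360 // alpha_deg
        a = 1.0 + ((ws - 1) / 2.0) ** 2                        # a - ((Ws-1)/2)^2 = 1
        hp = [0.0] * n
        for x, y, c in cells:
            d = math.hypot(x, y)
            if d == 0.0 or c == 0:
                continue
            b = math.degrees(math.atan2(y, x)) % 360.0         # direction B_ij
            v = math.degrees(math.asin(min(1.0, r_enlarged / d)))  # enlargement v_ij
            m = (c ** 2) * max(0.0, a - d ** 2)                # magnitude m_ij, clamped
            for k in range(n):
                ang = k * alpha_deg
                diff = abs((ang - b + 180.0) % 360.0 - 180.0)  # angular distance to B_ij
                if diff <= v:
                    hp[k] += m
        return hp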

4.3.5.2 The Binary Polar Histogram

Once the primary polar histogram has been calculated for each sector k, the binary

polar histogram is easily constructed. The binary polar histogram, for each sector k, is

calculated by using two defined thresholds, τlow and τhigh, the primary polar histogram,

and the following rules:


    H^b_{k,i} = 1              if H^p_{k,i} > \tau_{high}
    H^b_{k,i} = 0              if H^p_{k,i} < \tau_{low}
    H^b_{k,i} = H^b_{k,i-1}    otherwise

(here i denotes the current scan, so a sector keeps its previous value when the primary histogram falls between the two thresholds)

A sector k in the binary polar histogram is blocked if it has a value of 1 and is unblocked

if it has a value of 0. For example, if the sectors spanning 90° to 180° have values of 1, then an obstacle is present from 90° to 180°.
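The hysteresis rule above in code form (a sketch; prev holds the binary histogram from the previous cycle):

    # Sketch: threshold the primary polar histogram into a binary histogram
    # with hysteresis between tau_low and tau_high.
    def binary_polar_histogram(hp, prev, tau_low, tau_high):
        hb = []
        for k, value in enumerate(hp):
            if value > tau_high:
                hb.append(1)                   # sector k is blocked
            elif value < tau_low:
                hb.append(0)                   # sector k is free
            else:
                hb.append(prev[k])             # keep the previous decision
        return hb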

4.3.5.3 The Masked Polar Histogram

The masked polar histogram takes into account the vehicle turning radius when

deciding if an obstacle blocks the vehicle path. The turning radius creates a trajectory

circle, and if an enlarged obstacle cell overlaps the trajectory circle, then all directions

from the obstacle to the backwards direction of motion are blocked. Borenstein, in [22,

Fig. 4] gives an example of blocked directions. Fig. 4.4 is a similar example of blocked

directions, but with a difference in the x and y directions. The figure shows the vehicle

near two enlarged active cells. Assuming that the active cells contain obstacles (labeled

obstacle 1 and obstacle 2), the vehicle must decide which direction to travel so not to

strike the obstacles. When obstacle 2 is enlarged, it overlaps the right trajectory circle

making travel to the right of obstacle 2 impossible. When obstacle 1 is enlarged, it does

not overlap the left trajectory circle; thus travel to the left of obstacle 1 is possible. The

possible directions of travel then become between the two obstacles or to the left of

obstacle 1.

The locations of the right and left trajectory centers presented here are different

than those presented in [22], by a sign change on the ∆y, due to the different x and y

orientations.

Fig. 4.4: Example of blocked directions. The vehicle is near two enlarged obstacle cells (obstacle 1 and obstacle 2); the left and right trajectory circles of radius r_l and r_r, the offsets Δx and Δy, the distances d_l and d_r, and the free and blocked directions are marked.

The locations of the right and left trajectory centers, relative to the current

vehicle position and orientation are given by:

    \Delta x_r = r_r \sin\theta,    \Delta y_r = -r_r \cos\theta
    \Delta x_l = -r_l \sin\theta,   \Delta y_l = r_l \cos\theta

Thus the masked polar histogram is updated to reflect the right and/or left trajectory

circles being blocked. This is done by finding the distances from each active cell Aij to

the two trajectory centers as given by the formulas:

    d_r^2 = (\Delta x_r - A_{x_i})^2 + (\Delta y_r - A_{y_j})^2
    d_l^2 = (\Delta x_l - A_{x_i})^2 + (\Delta y_l - A_{y_j})^2

An obstacle blocks the vehicle direction to the right if:

    d_r^2 < (r_r + r_{r+s})^2


And an obstacle blocks the vehicle direction to the left if:

    d_l^2 < (r_l + r_{r+s})^2

After checking every cell in the active window with these two conditions, the masked

polar histogram is updated by defining two limit angles, ϕr and ϕl. A limit angle is

defined as the angle from the VCP to the first cell in the active window that is blocked and

on the trajectory circle. For example, in Fig. 4.4 the binary polar histogram would

indicate that the angles 150° to 120° and 45° to 25° are blocked. Because the left

trajectory circle is not blocked ϕl is set to the heading minus 180°. Because the right

trajectory circle is blocked, ϕr is set to 50°. In [22] a quick and efficient algorithm for

finding these two limit angles is given. The masked polar histogram can now be

constructed using ϕr, ϕl, and the binary polar histogram according to the following rules:

    H^m_k = 0    if H^b_k = 0 and k·α ∈ { [\varphi_r, \theta], [\theta, \varphi_l] }
    H^m_k = 1    otherwise

where θ is the current vehicle heading.

An example of several masked polar histograms that occur as a vehicle travels a path is

given in the simulation in Chapter 6, Fig. 6.1.
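A simplified sketch of the masking step: each occupied active cell is tested against the two trajectory circles, the limit angles are tightened accordingly, and sectors outside the free interval around the current heading are marked blocked. The signed-angle bookkeeping and the shared turning radius are simplifications and do not reproduce the efficient limit-angle search of [22]:

    # Sketch: mask the binary histogram using the left/right trajectory circles.
    # Angles are handled as signed offsets from the current heading, in degrees.
    import math

    def masked_polar_histogram(hb, cells, heading_deg, r_turn, r_enlarged, alpha_deg=5):
        dxr = r_turn * math.sin(math.radians(heading_deg))     # right trajectory center
        dyr = -r_turn * math.cos(math.radians(heading_deg))
        dxl, dyl = -dxr, -dyr                                   # left trajectory center
        phi_r, phi_l = -180.0, 180.0          # limit angles relative to the heading
        for x, y, certainty in cells:
            if certainty == 0:
                continue
            rel = (math.degrees(math.atan2(y, x)) - heading_deg + 180.0) % 360.0 - 180.0
            if rel < 0 and (dxr - x) ** 2 + (dyr - y) ** 2 < (r_turn + r_enlarged) ** 2:
                phi_r = max(phi_r, rel)       # nearest blocked direction on the right
            if rel > 0 and (dxl - x) ** 2 + (dyl - y) ** 2 < (r_turn + r_enlarged) ** 2:
                phi_l = min(phi_l, rel)       # nearest blocked direction on the left
        hm = []
        for k, blocked in enumerate(hb):
            rel = (k * alpha_deg - heading_deg + 180.0) % 360.0 - 180.0
            hm.append(0 if blocked == 0 and phi_r < rel < phi_l else 1)
        return hm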

4.3.5.4 Steering Direction

The masked polar histogram indicates the directions the vehicle is free to travel.

A decision on the best direction of travel must be made based on the direction to the

desired path. In [5], a single goal was assumed; for example, the goal was to get to a

point across a room. For our purposes, the goal is a point on the desired path of travel.


While the algorithm for selecting a steering direction is the same as in [5], our method

uses a point on the desired path that is a set distance (dpoint) ahead of the current vehicle

position as the goal. This allows us to track to the path passing through an obstacle while

not striking the obstacle. This is the most important feature of our implementation of the

VFH+ algorithm. For our purposes we need to travel as much of the preplanned path as

possible. Fig. 4.3 is an illustration of the vehicle tracking to a point on the desired path

while avoiding an obstacle.

Candidate directions of travel are chosen by finding the borders of all openings in

the masked polar histogram (sectors with values of 0). Then a cost function is calculated

by taking into account the difference between the candidate direction and the goal

direction. The candidate direction that has the lowest cost is selected as the steering

direction. In Fig. 4.3 the goal direction lies between the borders of the obstacle so the

vehicle’s candidate direction will be the goal direction.

The first step in finding candidate directions of travel is to find the left and right

borders, kl and kr, of all valleys in the masked polar histogram and determining the goal

direction kt. Valleys are considered any possible steering direction in the masked polar

histogram that has a value of zero. For a valley to be considered, it must be greater than

or equal to some smax number of sectors. For all valleys greater than or equal to smax there

are three possible candidate directions determined by the following rules:

cr = kr + smax/2

cl = kl - smax/2

ct = kt if kt∈ [cr, cl]


The candidate directions either lead the vehicle to the right or left side of obstacles or

towards the goal. All candidate directions are then put into a cost function as follows:

( ) ( )tkccg ,∆=

where:

( ) { }kcckcccccc +−−−−=∆ 21212121 ,,min,

The candidate direction that has the lowest cost is selected as the direction of travel for

the vehicle.
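The candidate-and-cost selection can be sketched directly from the rules above; here the histogram is treated as a simple list and circular wrap-around of valleys is ignored for brevity:

    # Sketch: pick a steering sector from the masked polar histogram.
    def select_steering_sector(hm, k_goal, s_max):
        n = len(hm)
        diff = lambda c1, c2: min(abs(c1 - c2), abs(c1 - c2 - n), abs(c1 - c2 + n))
        candidates = []
        k = 0
        while k < n:
            if hm[k] == 0:                       # start of a valley of free sectors
                start = k
                while k < n and hm[k] == 0:
                    k += 1
                if k - start >= s_max:           # wide enough for the vehicle
                    c_r = start + s_max // 2     # hug the valley's right border
                    c_l = (k - 1) - s_max // 2   # hug the valley's left border
                    candidates.extend([c_r, c_l])
                    if c_r <= k_goal <= c_l:
                        candidates.append(k_goal)  # drive straight at the goal
            else:
                k += 1
        if not candidates:
            return None                          # no safe direction: halt the vehicle
        return min(candidates, key=lambda c: diff(c, k_goal))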

In this chapter we discussed obstacle avoidance and its place in autonomous

vehicle research. We first defined the difference between global and reactive obstacle

avoidance, where the path planner, discussed in Section 2.3, performs global avoidance.

We next made a list of requirements that the obstacle avoidance algorithm had to exhibit

in order to be compatible with our design architecture. We then used this list to

compare five different techniques of obstacle avoidance used in research today. Finally,

we described the algorithm by Borenstein that served as the model for the avoidance

algorithm that was used on the tractor. Now that we have discussed obstacle detection

and avoidance as it pertains to autonomous vehicle research, it is time to address the

detection sensor and avoidance algorithm that was used on the tractor in greater detail.


Chapter 5

Obstacle Detection and Avoidance

Prior to this point a lengthy discussion of obstacle detection sensors and obstacle

avoidance algorithms was given. Now we come to the discussion of what was actually

used on the autonomous tractor project. This chapter is organized as follows. We first

look at the obstacle detection and avoidance system as a whole to understand how

everything meshed together. Next an explanation of how the obstacle data were

retrieved from the obstacle detection sensor is given. This includes the commands sent to

the LMS laser range finder as well as the response packets from the LMS. Also included

in the discussion of obstacle detection is the determination of obstacle locations. We next

discuss the obstacle filter and how this filter merges obstacle data from the LMS with

obstacle data from the path planner. Also included in the discussion of the filter is how

dynamic, or moving, obstacles are handled. Finally we conclude the chapter with a

detailed description of the avoidance algorithm.

5.1 Obstacle Detection and Avoidance Data Flow

The process of obstacle detection and avoidance is illustrated in Fig. 5.1. The

process starts by reading data from the laser range finder through an RS232

communication port. After the data has been collected, individual obstacles, along with

their locations, are determined relative to the vehicle’s position. If any obstacle is

determined to be too close to the vehicle then the vehicle stops immediately. If none of

the obstacles are too close to the vehicle then they are copied to the obstacle filter.

Fig. 5.1: Obstacle detection and avoidance flow diagram (read laser; determine obstacle locations; stop if an obstacle is too close; otherwise filter out known obstacles, update the obstacle map, report unknown obstacles to the planner, and, if the path is obstructed, compute a dynamic avoidance heading and steer to it).

The

obstacle filter determines if an obstacle was known previously or not. If it was not

known previously then the obstacle map is updated. The obstacle map contains all the

known and unknown obstacle locations and if an obstacle blocks the path of the vehicle

then the vehicle must avoid the obstacle. To avoid the obstacle an algorithm was

developed that determines the steering angle needed for the vehicle to avoid obstacles

and return to the desired path of travel as quickly as possible. The desired steering angle

is sent to the steering control system.
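The per-scan data flow of Fig. 5.1 can be summarized in a pseudocode-like sketch; every object and method named here (laser, detector, obstacle_filter, planner, steering) is a stand-in for the corresponding block in the figure, not the project's actual interface:

    # Sketch of the per-scan control loop from Fig. 5.1 (all collaborators are
    # placeholders for the real sensor, filter, planner, and steering nodes).
    def detection_avoidance_step(laser, detector, obstacle_filter, planner, steering, pose):
        ranges = laser.read()                        # read one 180-degree scan
        obstacles = detector.find_obstacles(ranges, pose)   # start/end points in world frame
        if any(detector.too_close(o) for o in obstacles):
            steering.stop()                          # obstacle inside the safety radius
            return
        unknown = obstacle_filter.filter(obstacles)  # drop known / out-of-field / spurious
        obstacle_filter.update_map(unknown)
        planner.report(unknown)
        if obstacle_filter.path_obstructed():
            heading = obstacle_filter.avoidance_heading(pose)
            steering.steer_to(heading)               # reactive avoidance takes over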

5.2 Obstacle Detection

Table 5.1 gives a detailed description of the data being reported by the Sick LMS laser range finder during continuous output mode, and Fig. 5.2 shows a graphical representation of the output data, with an obstacle between 10° and 45° and another obstacle between 135° and 150°.

Table 5.1: LMS Continuous Data Report Packet.

Byte #   Hex Value   Description
0        0x02        Start of new distance measurement packet
1        0x80        Address of LMS reporting distances plus 0x80 (in our case we had only one LMS at an address of 0x00)
2        0xD6        Low byte
3        0x02        High byte (represents the number of remaining bytes in packet, not including checksum)
4        0xB0        LMS command to send distance measurements
5        0x69        Low byte
6        0x01        High byte (represents the number of distance measurements; 0x0169 = 361)
7        0x??        Low byte
8        0x??        High byte (represents the distance measurement for 0 degrees)
9        0x??        Low byte
10       0x??        High byte (represents the distance measurement for 0.5 degrees)
...      ...         Distance measurements for 1.0 to 179.5 degrees
727      0x??        Low byte
728      0x??        High byte (represents the distance measurement for 180.0 degrees)
729      0x??        Status byte (packet is good if 0x10 is reported)
730      0x??        Low byte
731      0x??        High byte (represents the checksum)

Fig. 5.2: LMS output with 0.5° resolution.

Before continuous data can be obtained from

the LMS a reset string must be sent to the LMS and the proper response must be returned

by the LMS. Then the command to start reporting continuous data must be sent to the

LMS and again the proper response must be returned by the LMS. Once this sequence of

commands and responses has successfully been completed the response shown in Table

5.1 will be repeated continually until program termination/power loss, or until the LMS is


reset. The reset command to the LMS is given in Table 5.2. The continuous output

mode request command to the LMS is given in Table 5.3.

Once the distance measurements are being read from the LMS then the task is to

determine where obstacles are, relative to current vehicle position. As the distance

measurements are read from the LMS they are converted from meters to feet (we

understand feet better than meters) and then placed in an array. The first value in the

array represents the distance measurement for 0° and the last value represents the

distance measurement at 180°. The array containing the distance measurements is

examined element by element once, from the first to the last. Each element is determined

to be either a start point, end point (if start point has already been determined), part of an

obstacle that has a start point but has not reached the end point, or not belonging to an

obstacle if no start point has been found. The following list shows how a distance

measurement is determined to be one of the four possibilities:

1 – Start point
    - The distance measurement is the first element in the array (belongs to 0°) and is less than the maximum distance.
    - The distance measurement is less than the maximum distance and no start point has been found.
    - The last distance measurement (belongs to 180°) cannot be the start of an obstacle.

2 – End point
    - The first distance measurement (belongs to 0°) cannot be the end of an obstacle.
    - If a start point has already been located and the next element in the array is greater or less than some set threshold, indicating either that a new obstacle will start on the next element or that the current element is the end of the current obstacle. Note that the start point and end point can be the same, indicating the obstacle was seen by only one line scan.

3 – Belongs to current obstacle
    - The element belongs to the current obstacle if a start point has been found but the current element cannot be considered an end point.

4 – Does not belong to an obstacle
    - The element does not belong to an obstacle if it is greater than or equal to the maximum allowable distance.

Table 5.2: PC to LMS Reset Packet.

Byte #   Hex Value   Description
0        0x02        Start of new distance measurement packet
1        0x80        Address of LMS to reset (in our case we had only one LMS at an address of 0x00)
2        0x01        Low byte
3        0x00        High byte (represents the number of remaining bytes in packet, not including checksum)
4        0x10        Command to reset the LMS
5        0x69        Low byte
6        0x01        High byte (represents the checksum)

Table 5.3: PC to LMS Request for Continuous Data.

Byte #   Hex Value   Description
0        0x02        Start of new distance measurement packet
1        0x80        Address of LMS to reset (in our case we had only one LMS at an address of 0x00)
2        0x02        Low byte
3        0x00        High byte (represents the number of remaining bytes in packet, not including checksum)
4        0x20        Command to change LMS operating mode
5        0x24        Request monitoring mode
6        0x34        Low byte
7        0x08        High byte (represents the checksum)

Once the start point and end point have been determined then the two points are

transformed from polar coordinates to Cartesian coordinates using the position of the

LMS as an offset into the real world. Then the start and end points are stored in an

obstacle array and the start and end points are found for a new obstacle in the same line

scan. Fig. 5.3 shows an example of a distance array and how obstacles are determined.

To effectively illustrate the methodology Fig. 5.3 shows what an LMS might measure

from 0° to 180° with a resolution of 5°.
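The start-point/end-point pass over the distance array and the polar-to-Cartesian conversion can be sketched as follows; the jump threshold, the mapping of scan angle to world heading, and the handling of the last element are illustrative assumptions:

    # Sketch: segment one distance array (index i -> angle i * step_deg) into
    # obstacles and convert each start/end point to world x,y coordinates.
    import math

    def segment_obstacles(dists, step_deg, max_dist, jump, lms_x, lms_y, lms_heading):
        def to_xy(i, d):
            # assumed mapping: the scan spans heading - 90 deg to heading + 90 deg
            ang = math.radians(lms_heading + i * step_deg - 90.0)
            return lms_x + d * math.cos(ang), lms_y + d * math.sin(ang)
        obstacles, start = [], None
        for i, d in enumerate(dists):
            if d < max_dist and start is None:
                start = i                                         # start point found
            elif start is not None:
                last = i == len(dists) - 1
                if abs(d - dists[i - 1]) > jump or d >= max_dist or last:
                    end = i if (last and d < max_dist) else i - 1  # end point (may equal start)
                    obstacles.append((to_xy(start, dists[start]), to_xy(end, dists[end])))
                    start = i if (d < max_dist and not last) else None
        return obstacles   # list of (start_xy, end_xy) pairs, one per obstacle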

Once the distance array has been examined element by element and all the

obstacles in that line scan have been found, some filtering of the obstacles is done.

The reason for filtering is to discard any obstacle that the LMS reported that may be false

or not actually present. One way to filter is to calculate the distance away from the LMS

that an obstacle can be for a given number of line scans. For example, the minimum

distance away from the LMS for an obstacle that crosses 8 line scans is calculated. If an

obstacle crosses only 8 line scans and the distance from the LMS at both the start and end

points is less than the minimum distance then the obstacle is discarded. Once an obstacle

has been determined to be legitimate then the distance is examined. If the start point and

end point have a distance closer than some safe distance then the vehicle must stop so as

not to strike the obstacle. If no obstacle stops the vehicle then all the obstacles from the

current line scan are copied to the obstacle filter.

Fig. 5.3: LMS obstacle determination (an example distance array from 0° to 180° at a 5° resolution, showing the maximum distance and the start and end points of five obstacles; one obstacle's start and end points coincide because it was seen by only one line scan).

5.3 Obstacle Filter

The purpose of the obstacle filter is to filter out (ignore) obstacles that the laser

detects which are not in the GPS field boundaries, obstacles that are already known in the

field, and spurious data that the laser detects (such as dust). If the obstacle is not filtered

out, it is an unknown obstacle and the tractor must avoid it if the obstacle is in its path.

The obstacle filter is made up of four maps the size of the field, all of which are

represented as bitmaps with cells. The first map is the known obstacle map: a map of

the field and all the known obstacles in the field, which is set up


when a map is loaded on the GUI. The dimensions of the map and all known obstacles

in the map are reported to the Sensor Node and then to the obstacle filter. The boundaries

of the map are placed in the known obstacle map as known obstacles for the purpose of

obstacle avoidance. If the boundaries are known obstacles, the avoidance algorithm will

not drive the tractor out of the field in order to avoid an obstacle close to a boundary.

The second map is the obstacle count map used to keep track of how many times

an obstacle (an obstacle that is not known) has been seen by the laser. Each time an

obstacle is detected, each cell that it occupies in the obstacle count map is incremented by

one. An obstacle is not considered an unknown obstacle until at least one cell that it

occupies in the obstacle count map is greater than a set number. The greater the number,

the more times the obstacle has to be seen to be considered an unknown obstacle. In our

implementation, the obstacle has to be seen twice in order to be considered as unknown.

This is used to filter out spurious data that the laser detects such as dust and for the

purposes of dynamic obstacles.

The third map, the certainty grid map, is used to determine how likely it is that an

obstacle is located at a certain place in the field. The greater the certainty value of a cell,

the more likely an obstacle is located in the corresponding place in the field. All known

obstacles are placed into the certainty grid as the maximum certainty value when a map is

loaded on the GUI. As the laser detects obstacles and they are determined to be

unknown, the corresponding cells in the certainty grid are incremented by some certainty

increment number. All cells that do not contain an unknown obstacle in one laser scan

are then decremented by a certainty decrement number (if the cells do not already have a


certainty value of 0). The decrementing of the certainty cells is used primarily for

dynamic obstacles. As the dynamic obstacle moves in the field, the laser detects it in a

different place every scan. It is incremented in one cell the first scan and another cell the

second scan and so on. If the cells are not decremented when nothing is detected, then it

will look as if the dynamic obstacle is always there instead of moving across the field.

The certainty grid is used heavily in the obstacle avoidance algorithm in determining

what direction to head in order to avoid obstacles.
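A minimal sketch of that per-scan update, assuming a flattened integer grid and hypothetical increment, decrement, and maximum values (the thesis does not give the exact constants, and cells holding known obstacles, which are pinned at the maximum, would presumably be excluded from fading):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical certainty-grid update applied once per laser scan.
// `occupied` marks the cells covered by unknown obstacles in the current scan.
void updateCertaintyGrid(std::vector<int>& certainty,
                         const std::vector<bool>& occupied,
                         int increment, int decrement, int maxCertainty) {
    for (std::size_t i = 0; i < certainty.size(); ++i) {
        if (occupied[i]) {
            // Cells seen this scan become more certain, up to the maximum.
            certainty[i] = std::min(certainty[i] + increment, maxCertainty);
        } else if (certainty[i] > 0) {
            // Cells not seen this scan fade toward zero, so a moving obstacle's old
            // positions decay instead of persisting as a trail across the field.
            certainty[i] = std::max(certainty[i] - decrement, 0);
        }
    }
}
```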

The final map is the temporary map. The temporary map is just what the name

implies. It is a temporary map used simply to check if an unknown obstacle intersects a

path segment and used in determining which cells contain obstacles in a laser scan that

should be incremented in the certainty grid. This map is not stored for any long period of

time. With each laser scan the temporary map is continually being cleared and redrawn

for various calculations and simple comparisons.

5.3.1 Obstacle Filter Flow of Control

The flow of control of the obstacle filter is illustrated in Fig. 5.4. Each function in the code that is used in the obstacle filter is described below.

1. mapClear(): initializes all maps used by the filter (known obstacle map, obstacle count map, certainty grid, and a temporary map used for determining unknown obstacles). Draws the boundaries of the field in the known obstacle map for obstacle avoidance purposes.

2. mapPoly(): immediately after the maps have been initialized, all known obstacles are drawn into the known obstacle map as well as placed into the certainty grid with the maximum certainty value.


[Figure 5.4 content: flow from "Load map on GUI" to "Initialize obstacle filter maps" to "Filter obstacles in one laser scan"; obstacles that are known or out of the GPS boundary are filtered out, while unknown obstacles are checked against the path (stop the vehicle if the obstacle is too close to the tractor, begin obstacle avoidance if the obstacle intersects the path); the certainty grid is then updated and unknown obstacles are reported to the Planner.]

Fig. 5.4: Obstacle filter flow diagram.

3. filterObstacle(): As soon as the tractor begins the mission, filterObstacle() is called with every laser scan.

i. For every obstacle in one laser scan, first it is determined if the obstacle is out of the GPS field boundaries. If it is, the obstacle is ignored.

ii. If the obstacle is inside the GPS boundaries, check to see if it is a known obstacle. If it is a known obstacle, filter it out (ignore it) because the path planner has already planned a path around it.

iii. If it is not a known obstacle, increment the cells that it occupies in the obstacle count map. If it has been seen more than once (the count is 1 or more in at least one of the cells that it occupies in the obstacle count map), flag it as an unknown obstacle.

iv. Draw the unknown obstacle into the temporary map along with the current path segment. If the path intersects the obstacle, determine the distance from the current tractor position to the obstacle.

v. If the tractor is a distance less than or equal to a given stopping distance from the obstacle, the tractor is halted because it is too close to the obstacle.

vi. If the tractor is greater than the stopping distance but less than the given maximum distance to an obstacle, start the obstacle controller thread to begin the avoidance algorithm (a sketch of this distance-based decision is given after this list).

vii. Check the next path segments for intersection with the unknown obstacle if needed. If there is an intersection, v and vi are repeated. Once all unknown obstacles in one laser scan have been determined, increment all the cells that contain obstacles and reside in the certainty grid by the set increment value. Any cell that is not occupied by an unknown obstacle in one laser scan is decremented in the certainty grid by the set decrement value. Decrementing of the certainty grid is done to take into account dynamic obstacles.

viii. Send all unknown obstacles to the path planner to be relayed back to the GUI for display purposes.
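The distance-based decision in steps v and vi reduces to a small comparison; a sketch, where the names and the exact comparison operators are assumptions:

```cpp
// Hypothetical outcome of checking one unknown obstacle against the current path segment.
enum class AvoidAction { Ignore, StopVehicle, StartAvoidance };

// distToObstacle: distance from the current tractor position to the blocking obstacle.
// stopDistance:   at or inside this range the tractor must halt (step v).
// maxDistance:    beyond this range the obstacle is not yet acted upon (step vi).
AvoidAction decideAction(double distToObstacle, double stopDistance, double maxDistance) {
    if (distToObstacle <= stopDistance) return AvoidAction::StopVehicle;
    if (distToObstacle < maxDistance)   return AvoidAction::StartAvoidance;  // start the obstacle controller thread
    return AvoidAction::Ignore;
}
```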

5.3.2 Dynamic Obstacles Overview

In order to take into account dynamic obstacles, the obstacle filter has to perform

a few specialized tasks. A dynamic obstacle is different from the regular unknown

obstacles in that it is not detected by the laser at the same location in the field every laser

scan. As the tractor is moving, the obstacle is also moving, or may move at any time. So

why do these obstacles have to be handled differently? First, let’s consider a bale of hay

in the field that will obviously never move throughout the tractor’s mission. The bale of

hay is considered to be an unknown obstacle because it is not a permanent obstacle in the


field that will always be there. The farmer may remove it from the field in between

missions. As the tractor gets within the laser detection range of the hay bale, it will be

detected with each laser scan and reported to the obstacle filter. Because the bale of hay

is in the same location in the field all the time, the corresponding cells in the certainty

grid will be incremented with each laser scan until the maximum certainty value is

reached. The higher the certainty value of a cell, the more likely an obstacle appears at

the corresponding location in the field. As the tractor passes the bale of hay and the laser

no longer detects it, the cells it occupies in the certainty grid are no longer incremented.

When the tractor turns around and comes from the other direction, it will detect the bale

of hay again and increment the certainty grid.

This approach works well for unknown obstacles in the field that never move; however, if the obstacle is moving, things have to be handled differently. If a deer

happens to walk through the field, the cells in the certainty grid that it occupies on a

particular laser scan cannot simply be incremented. The certainty grid would then

incorrectly show the deer as a long line of the path that the deer is taking across the field.

If the tractor started to avoid an obstacle and came close to the path that the deer had

taken, it would avoid the path even though the deer is nowhere to be seen. In order to

represent the possibility of dynamic obstacles in the field, when the certainty grid is

updated, not only are occupied cells incremented, but unoccupied cells are decremented

as well. Therefore, as the deer walks across the field, the laser detects him at one location

in the field and updates the corresponding certainty cells, but in the second laser scan (if

the deer is moving fast enough) the deer will be detected in a different location on the


field. The new location will be incremented in the certainty grid at the same time as

the old location is being decremented. This creates a fading effect of obstacles. The

locations where the deer used to be in the certainty grid will eventually fade back to 0

because the deer is no longer there. If the tractor begins to avoid an obstacle close to the

path that the deer took across the field, it will no longer begin avoiding the path because

it has been faded away and is no longer considered to be an obstacle that needs to be

avoided.

Another feature that aids in dynamic obstacles is the obstacle count map. In order

to be considered an unknown obstacle, an obstacle must have a count of one or more in

its corresponding cells (the obstacle has been detected by the laser more than once). If a

deer sprints through the field and is detected only once at each location, the deer will not even be considered an obstacle; it will be filtered out. However, if the deer is moving

at a slow enough pace to be detected by the laser in the same location more than once,

then it is an unknown obstacle and will be avoided if necessary.

5.4 Avoidance

Once the obstacle filter has determined that an obstacle blocks the desired path of travel, the vehicle must exit the path, pass the obstacle, and return to the path as quickly as possible. This is accomplished by using a modified vector field histogram algorithm.

At this point we need to make a distinction between the modified vector field histogram algorithm that we used and the algorithm used by Borenstein [22-27]. In

Borenstein’s algorithm the autonomous vehicle was assumed to be able to travel in any

direction, with steering radius limitations. This means that if the vehicle determined the


safest route of travel was to turn a full 180° and retrace its path for a time then the

algorithm allowed for this. To determine steering direction Borenstein examined all the

possible driving directions looking for a valley of driving directions that his vehicle could

pass through. Our implementation differs on two points. First, a farm tractor follows

a set path every time; too much deviation from that path is undesirable so our algorithm

limited the number of directions that the vehicle could travel. Instead of steering

directions every 5° we limited the number of possible steering directions to eight, as will

be described in further detail below. This allowed us to construct a masked polar

histogram using only the eight possible directions of travel. The second difference with

our algorithm is that we constrained the vehicle so that at certain times in the avoidance

process only certain driving directions were possible. If these driving directions were

blocked then the tractor stopped. In Borenstein’s algorithm any possible driving

direction could be taken so long as there was a big enough valley, meaning that the

vehicle could wander until it was able to reattach itself to the desired path. We did not

want this. If the algorithm decided to avoid to the right of an obstacle and later the right

side became blocked then we wanted the vehicle to stop, not attempt another way

around the obstacle.

Now that we have discussed the differences between the two algorithms we will

discuss our avoidance algorithm in greater detail. The algorithm is given several possible

headings for the vehicle to travel. As mentioned, our implementation specified eight

possible directions of travel. Those eight directions are:

1- Path heading (desired heading)
2- +35° of path heading


3- -35° of path heading
4- Current vehicle heading
5- +25° of path heading
6- -25° of path heading
7- +45° of path heading
8- -45° of path heading

The algorithm then must decide which of the eight directions to take so that the vehicle exits the path, travels alongside the obstacle, and returns to the path.

Before any determination of which direction to take, the algorithm must know if

any of the eight headings are blocked, meaning that the vehicle cannot travel in that

direction. To determine the blocked headings, the certainty grid discussed in the obstacle

filter section is used. Because the obstacle filter represents the entire field of travel for

the vehicle, only a partial examination of the certainty grid is necessary. An active

window is a grid that follows the vehicle and overlaps only a portion of the certainty grid,

specifically the immediate vicinity of the vehicle. The active window is a square window

that does not change orientation and always represents the vehicle at its center grid

position. An example of an active window and its overlap into the certainty grid is given

in Fig. 5.5. Fig. 5.5 shows a map of 30x30 and an active window of 15x15.

The first step in determining if any of the 8 directions are blocked is to construct a

binary histogram of the eight directions. A binary histogram indicates which of the eight

directions are blocked or unblocked. The first step in constructing the binary histogram

is to treat each cell in the active window as a vector from the center of the window. The

direction vector βij is determined by the following, where i and j represent the rows and

columns of the active window.


\beta_{ij} = \tan^{-1}\left(\frac{y_j - y_o}{x_i - x_o}\right)

where:

x_o, y_o: Center coordinates of the active window.

x_i, y_j: Coordinates of the active cell.

This matrix can be precompiled, and for computational reasons the values should be stored in radians. An example of this matrix for a 15x15 active window is shown in Fig. 5.6, with the values converted to degrees for visualization.
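A sketch of the precompilation step in C++ (atan2 replaces the plain arctangent so every quadrant is handled; the row/column-to-x/y convention and the function name are assumptions):

```cpp
#include <cmath>
#include <vector>

// Precompile beta_ij for every cell of a ws x ws active window, with the window
// center treated as the vehicle cell. Values are stored in radians in [0, 2*pi).
std::vector<std::vector<double>> precompileDirectionMatrix(int ws) {
    const double twoPi = 2.0 * std::acos(-1.0);
    const int center = ws / 2;
    std::vector<std::vector<double>> beta(ws, std::vector<double>(ws, 0.0));
    for (int i = 0; i < ws; ++i) {
        for (int j = 0; j < ws; ++j) {
            double angle = std::atan2(static_cast<double>(j - center),   // y_j - y_o
                                      static_cast<double>(i - center));  // x_i - x_o
            if (angle < 0.0) angle += twoPi;  // wrap into [0, 2*pi)
            beta[i][j] = angle;               // the center cell (the vehicle) stays at 0
        }
    }
    return beta;
}
```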

[Figure 5.5 content: a 30x30 certainty grid with a 15x15 active window overlaid; the overlay shows the offset into the active window and the vehicle center point.]

Fig. 5.5: Certainty grid with active window overlay.


Along with a vector direction matrix, a vector magnitude matrix is required. This

matrix will allow us to compute the magnitude of each cell in the active window. Cells

that are further away from the vehicle center point are weighted less than cells closer to

the vehicle. The formula for the vector magnitude is given by:

m_{ij} = \left(c_{ij}\right)^{2}\left(a - d_{ij}^{2}\right)

where:

c_{ij}: Certainty value of the active cell.

d_{ij}: Distance from the active cell to the center of the active window.

and the value a is chosen according to:

[Figure 5.6 content: the precompiled 15x15 vector direction matrix, with each cell's direction given in degrees relative to the window center.]

Fig. 5.6: Precompiled vector direction matrix.


a = 1 + \left(\frac{W_s - 1}{2}\right)^{2}

where:

W_s: Size of the active window.

The a - d_{ij}^{2} portion of the vector magnitude matrix can be precompiled for each cell in the active window. One stipulation to the formula is that if a - d_{ij}^{2} is less than 1 then the

value is set to 0. The full magnitude will have to be computed once the vehicle is moving

in its world and certainty values are known. An example of the precompiled vector

magnitude matrix is given in Fig. 5.7.
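The distance-dependent part can be precompiled the same way; a sketch under the same assumptions (for a 15x15 window this reproduces the 50-at-center, 0-toward-the-corners pattern of Fig. 5.7):

```cpp
#include <vector>

// Precompile (a - d_ij^2) for a ws x ws active window, where d_ij is the distance
// from cell (i, j) to the window center and a = 1 + ((ws - 1) / 2)^2.
// Per the text, any entry that would fall below 1 is set to 0. At run time the
// full magnitude is m_ij = c_ij^2 * mag[i][j], using the current certainty values.
std::vector<std::vector<double>> precompileMagnitudeMatrix(int ws) {
    const int center = ws / 2;
    const double half = (ws - 1) / 2.0;
    const double a = 1.0 + half * half;
    std::vector<std::vector<double>> mag(ws, std::vector<double>(ws, 0.0));
    for (int i = 0; i < ws; ++i) {
        for (int j = 0; j < ws; ++j) {
            const double dx = i - center;
            const double dy = j - center;
            const double value = a - (dx * dx + dy * dy);  // a - d^2
            mag[i][j] = (value < 1.0) ? 0.0 : value;
        }
    }
    return mag;
}
```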

With these two precompiled matrices the binary polar histogram can be

constructed. For each of the eight desired directions k, the polar obstacle density is

calculated by:

[Figure 5.7 content: the precompiled 15x15 vector magnitude matrix (the a - d_{ij}^{2} values), which peaks at 50 at the window center and falls to 0 toward the corners.]

Fig. 5.7: Precompiled vector magnitude matrix.


H_{k}^{p} = \sum_{i,j \in \text{ActiveWindow}} m_{ij} \, h'_{ij}

where:

h'_{ij} = 1 if k \in [\beta_{ij} - \alpha, \beta_{ij} + \alpha]

h'_{ij} = 0 otherwise

and α represents a set angular growth applied to the obstacles. In our case α is 5°. What all this

means is that for each of the eight desired directions the active window is examined along

with the certainty cells in the certainty grid that the active window overlaps. For

example, if one of the desired directions of travel was 90° then every cell in the active

window is examined. If a cell has a certainty value and that cell’s vector direction is 90°

+/- 5° then the certainty value is squared and multiplied by the corresponding cell in the

vector magnitude matrix and added to a running sum. If that sum, taken over all active window cells that lie within the desired direction, is greater than some threshold τ

then that direction is considered blocked and receives a value of one in the binary polar

histogram. If the sum of all the magnitude cells that lie within the desired direction is

less than τ then the direction is considered not blocked and the binary polar histogram

assigns that direction a zero. Doing this will compute the binary polar histogram for all

the desired directions. Our implementation computes two binary polar histograms. The first, as mentioned above, is the histogram for the 8 desired directions of travel. This

histogram is used to determine the possible directions of travel that are unblocked and

those that are blocked. The second histogram is computed the same way but computes

direction vectors from 0° to 360° every 5°. This histogram is only referred to when


determining if the vehicle has passed the obstacle being avoided and does not influence

the direction the vehicle will travel in. This histogram also uses a smaller active window,

with the active window size being the distance to maintain between the tractor and the

obstacle.
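Combining the two precompiled matrices with the live certainty values gives the blocked/unblocked decision for each candidate heading; a sketch, with the threshold τ and all names treated as assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// For each candidate heading (radians), return 1 if that direction is blocked and
// 0 otherwise. `certainty` is the active-window view of the certainty grid;
// `beta` and `mag` are the precompiled matrices sketched above. alpha is the
// obstacle growth (5 degrees in the text); tau is an assumed tuning threshold.
std::vector<int> binaryPolarHistogram(const std::vector<std::vector<int>>& certainty,
                                      const std::vector<std::vector<double>>& beta,
                                      const std::vector<std::vector<double>>& mag,
                                      const std::vector<double>& headings,
                                      double alpha, double tau) {
    const double twoPi = 2.0 * std::acos(-1.0);
    std::vector<int> blocked(headings.size(), 0);
    for (std::size_t k = 0; k < headings.size(); ++k) {
        double density = 0.0;  // H_k^p, the polar obstacle density for heading k
        for (std::size_t i = 0; i < certainty.size(); ++i) {
            for (std::size_t j = 0; j < certainty[i].size(); ++j) {
                // h'_ij = 1 when the cell direction lies within +/- alpha of heading k.
                double diff = std::fabs(beta[i][j] - headings[k]);
                diff = std::min(diff, twoPi - diff);
                if (diff <= alpha) {
                    const double c = static_cast<double>(certainty[i][j]);
                    density += c * c * mag[i][j];  // m_ij = c_ij^2 * (a - d_ij^2)
                }
            }
        }
        blocked[k] = (density > tau) ? 1 : 0;  // blocked if the density exceeds tau
    }
    return blocked;
}
```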

With the two histograms having been computed the vehicle has enough

information about its environment to avoid obstacles. The approach that was taken to

avoid obstacles was kept as simple as possible using an idea from the classic Boy Scout

orienteering problem. If a scout starts out on a hike and wants to get to the base of a

large mountain he will travel a straight path at a constant heading. If a lake blocks his

path and forces him to change his desired heading he will choose a new heading +/- 90°

of the desired heading. Suppose for this example he chooses to turn to the right 90°. As

he travels the new heading he will always check his desired heading, or to his left, to see

if he has passed the lake. Once he has passed the lake he will again travel the desired

heading. Now the lake will be on his left and he will travel the desired heading until the

lake is no longer on his left. At that point he will adjust his heading to 90° to the right so

he can walk back to the original path he would have taken if the lake were not present.

When he reaches the desired path he will turn right 90° and walk at his desired heading

until he reaches the camp. Fig. 5.8 shows this concept. This concept was used to

develop our avoidance algorithm. Instead of turning +/- 90° of the desired heading the

deviation from the path is specified to be +/- 35° of the desired heading, and when the

tractor passes the obstacle the return path is specified to be +/- 25° of the desired heading.


[Figure 5.8 content: the start point, the goal, the desired path, and the actual path of travel around the lake.]

Fig. 5.8: Classic boy scout orienteering problem.

A shallower return path to the desired path was used to help the controller attach the

tractor back to the path more smoothly.

To successfully avoid an obstacle four states are specified. The four states are the

exit state, straight state, return state, and stop state. In the exit state the direction the

tractor needs to take to get off the desired path is determined. The two possible heading

directions will be either +/- 35° of the desired heading (heading of the path). When the

heading to exit the path has been determined the tractor is considered to be in the straight

state. In the straight state the vehicle travels at either +/- 35° until the desired heading

(path heading) is no longer blocked in the binary polar histogram. After the desired

heading is unblocked the vehicle drives to the desired heading and is considered to be in

the return state. In the return state a determination is made of when the vehicle, including

any implement being used, has passed the obstacle. To determine when the vehicle has

passed the obstacle the algorithm examines +/- 25° of the desired heading (the one that


will take the vehicle back to the desired path) in the binary polar histogram to make

sure the direction is open. The algorithm also examines the second binary polar

histogram making sure that the obstacle does not appear on the side that the tractor

desires to travel. Once the algorithm has determined the tractor has passed the obstacle

then the tractor is steered back to the desired path and the vehicle is considered to be in

the stop state. In the stop state the distance from the tractor to the desired path is

calculated. When this distance is within a set threshold then the obstacle avoidance

algorithm stops and the Master Node takes over driving the vehicle and the tractor has

successfully avoided an obstacle. An example of each of the states is given in Figs. 5.9-

5.12.
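The four states and the transitions described above can be summarized as a small state machine; the sketch below only encodes what the text states, with hypothetical field and function names:

```cpp
// Hypothetical summary of the avoidance state sequence described above.
enum class AvoidState { Exit, Straight, Return, Stop, Done };

struct AvoidanceInputs {
    bool desiredHeadingBlocked;  // from the 8-direction binary polar histogram
    bool returnHeadingBlocked;   // the +/-25 degree direction back toward the path
    bool obstacleOnReturnSide;   // from the 0-360 degree histogram (smaller window)
    double distanceToPath;       // current distance from the tractor to the desired path
    double pathAttachThreshold;  // distance at which the Master Node resumes control
};

AvoidState step(AvoidState state, const AvoidanceInputs& in) {
    switch (state) {
        case AvoidState::Exit:
            // Choose +/-35 degrees of the path heading to leave the path, then go straight.
            return AvoidState::Straight;
        case AvoidState::Straight:
            // Hold the exit heading until the path heading is no longer blocked.
            return in.desiredHeadingBlocked ? AvoidState::Straight : AvoidState::Return;
        case AvoidState::Return:
            // Wait until the tractor (and implement) has passed the obstacle: the return
            // direction is open and the obstacle no longer appears on that side.
            if (!in.returnHeadingBlocked && !in.obstacleOnReturnSide)
                return AvoidState::Stop;
            return AvoidState::Return;
        case AvoidState::Stop:
            // Steer back toward the path; hand control back once close enough.
            return (in.distanceToPath <= in.pathAttachThreshold) ? AvoidState::Done
                                                                 : AvoidState::Stop;
        case AvoidState::Done:
            return AvoidState::Done;
    }
    return state;
}
```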

After the avoidance algorithm has determined the direction the tractor needs to travel to avoid obstacles, the task is to steer the tractor to that heading and track it. To control the tractor steering while avoiding, a motion exception drive vector

must be communicated to the Master Node. This informs the Master Node to stop using

the path tracker for control of the vehicle steering because new drive information is being

communicated from the motion exception. The steering itself is controlled by a simple P controller in the Sensor Node. Fig. 5.13 shows the block diagram of the

steering controller.
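A minimal sketch of such a proportional steering law, assuming headings in radians and hypothetical gain and saturation values (the actual controller constants are not given in the text):

```cpp
#include <algorithm>
#include <cmath>

// Proportional steering command toward a desired heading.
// Headings are in radians; the returned command is clamped to +/-maxSteer.
double pSteeringCommand(double desiredHeading, double currentHeading,
                        double kp /* assumed gain */, double maxSteer) {
    const double twoPi = 2.0 * std::acos(-1.0);
    // Wrap the heading error into (-pi, pi] so the controller turns the short way.
    double error = std::fmod(desiredHeading - currentHeading, twoPi);
    if (error > twoPi / 2.0)  error -= twoPi;
    if (error < -twoPi / 2.0) error += twoPi;
    const double cmd = kp * error;
    return std::max(-maxSteer, std::min(maxSteer, cmd));
}
```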

In this section we discussed the details of the obstacle detection and avoidance

system that was used on the tractor. Specifically, we discussed the flow of the obstacle

detection and avoidance system. We then addressed how the LMS laser range finder

reports obstacle data and how that obstacle data is interpreted into obstacle locations


relative to the vehicle position. We then examined how the obstacle data from the

LMS and from the path planner are combined in the obstacle filter. Finally, we detailed

the avoidance algorithm that was used on the tractor.

[Figure 5.9 content: the tractor in a field with a pond and a barrel obstacle, with a 50' distance marked.]

Fig. 5.9: Exit state.

Fig. 5.10: Straight state.


Fig. 5.11: Return state.

Fig. 5.12: Stop state.



Fig. 5.13: Block diagram of steering control during avoidance.


Chapter 6

Simulations and Vehicle Tests

The simulation and testing of the obstacle detection and avoidance system took

place over the course of the project. This section details the evolution of the avoidance

algorithm from the initial Matlab simulations to the final version that was utilized on the

tractor.

At the start of the project we researched various obstacle avoidance algorithms.

As Chapter 4 explains, we made a list of five required criteria for the avoidance algorithm

to merge with our design architecture. From that list we decided that the VFH+

algorithm from Borenstein [21] would be the model we would use for our algorithm. To

prove that the VFH+ algorithm was a good method of obstacle avoidance we wrote

several Matlab functions that simulated a vehicle moving in a world map while it

encountered obstacles. We then plotted the masked polar histogram (Sub-section 4.3.5.3)

at various vehicle positions in the world. We also plotted the vehicle moving around

these obstacles due to the VFH+ algorithm. Fig. 6.1 shows the plots for the VFH+

simulation with the following simulation parameters.

map size: 80x80 ft
cell size: 1x1 ft
active window size: 31x31 ft
vehicle size: 2x2 ft
vehicle speed: 5 ft/sec
angle resolution: 5°
r_r+s: 6 ft
turning radius: 5 ft
s_max: 20 sectors


Fig. 6.1: Map of VFH+ simulation.


Fig. 6.1: Map of VFH+ simulation (cont.).


With the Matlab simulations proving that the algorithm did work as expected

we then proceeded to write the algorithm in C++ on our tractor computers. After this was

done we demonstrated that the obstacle avoidance algorithm worked while the tractor

was on blocks. In this demonstration we simulated obstacles being detected in front of

the vehicle and the avoidance algorithm making vehicle-heading corrections to avoid

collisions and get the tractor on the desired path as quickly as possible.

After the on-blocks demonstration came two months of field-testing. It was once

we started testing in the field that we saw the first signs of instability in the VFH+

algorithm. The instability occurs because of the number of choices the VFH+ algorithm must make. The VFH+ algorithm must choose among headings from 0° to 360° every 5°. That gives

72 different heading choices to make. For an omni-directional robot the number of

choices would be fine, but for a tractor that needs to veer off course at most +/- 20°, 72

choices is too many. From that point the algorithm evolved into what is presented in

Chapter 5. This algorithm proved to be very effective in every situation where it was

tested.


Chapter 7

Conclusions and Recommendations

In this thesis we have discussed obstacle detection and avoidance for an autonomous

farm tractor. We initiated the discussion by examining the motivation for the work. We

then presented the tractor project as a whole, detailing how each of the subsystems

worked together and then describing each subsystem separately. We next turned our

attention to obstacle detection technology, describing the different sensors used in

literature as well as the sensor selected to be used for the project. We then discussed

obstacle avoidance. We highlighted the differences between global and local avoidance

and we specified five criteria for an avoidance algorithm that would merge into our

design architecture. We then discussed different avoidance algorithms and considered if

they could be merged into our design. The details of the algorithm that fit our design

were then presented. We then discussed the obstacle detection and avoidance system that

was used on the tractor. We highlighted the differences between our avoidance algorithm

and Borenstein’s algorithm. Finally we presented the simulation and testing procedures

that helped us refine our avoidance algorithm.

Now that all is said and done, what have we learned and where can this work lead

us? In retrospect the avoidance algorithm we ended up using was very simple, yet the

path to simplicity is often steep and treacherous. Many times as engineers we tend to

complicate matters when the simplest answer is often the best. We experienced this more

than once on the project. Another lesson learned was that man is an incredible machine.

We take for granted the powerful sensors our eyes truly are and we hope that one day


some genius will be able to construct a sensor as effective as the eye. The detection

sensor is truly the limiting factor in this whole process. No matter how clever the whole

system is, without the knowledge of the surrounding environment we cannot operate

safely in any farming environment. Fortunately there are clever people who are trying to

overcome this obstacle with the technology of today. By using a combination of different

detection sensors and fusing the data a solution may be possible.

Where will this work lead in the future? As was demonstrated in this thesis, the

author examined many different obstacle avoidance techniques and then designed a

unique approach based loosely on other people’s work. The author’s avoidance

algorithm, while different from all other avoidance algorithms, is specific to the farming environment, specifically slow-moving vehicles that do not deviate much from the desired

path of travel because of possible crop damage. In order to improve on this algorithm

more testing in the field needs to be done. More testing would allow us to try different

exit angles from the path and different return angles to the path, finding the optimal

solution. More testing would also demonstrate the robustness of the algorithm. The

obstacle filter, which was a novel approach to combining global and local obstacle

avoidance, can be improved to try and take out spurious data. Some improvements could

be pattern recognition; examining the filter map and determining if a cluster of cells is

dust (spurious data) or an actual obstacle. This avoidance algorithm could also be used in

other autonomous vehicle research, such as the omni-directional vehicles being

developed by Utah State University. The algorithm would have to be adapted to the

specific robot function, but the basis for the algorithm stays the same. There are many


applications in which this obstacle detection and avoidance approach could be used; the task is for those who come after us to improve upon our foundation.


References

[1] K. Moore and M. Torrie, "John Deere Final Report: Obstacle Detection and Avoidance for an Automated Tractor," Tech. Rep., Center for Self-Organizing and Intelligent Systems, Utah State University, Aug. 1999.

[2] N. Flann, K. Saunders, and L. Pells, "Cooperative mission planning and execution," Proceedings of the SPIE Conference on Robotics and Semi-Robotic Ground Vehicle Technology, pp. 147-154, 1998.

[3] L. Navarro-Serment, C. Paredis, and P. Khosla, "A beacon system for the localization of distributed robotic teams," Proceedings of the International Conference on Field and Service Robotics, pp. 232-237, 1999.

[4] T. Bailey, E. Nebot, J. Rosenblatt, and H. Durrant-Whyte, "Robust distinctive place recognition for topological maps," Proceedings of the International Conference on Field and Service Robotics, pp. 347-352, 1999.

[5] N. Harper and P. McKerrow, "Detecting plants for landmarks with ultrasonic sensing," Proceedings of the International Conference on Field and Service Robotics, pp. 144-149, 1999.

[6] A. Howard and L. Kitchen, "Cooperative localization and mapping," Proceedings of the International Conference on Field and Service Robotics, pp. 92-97, 1999.

[7] Y. Matsumoto, K. Ikeda, M. Inaba, and H. Inoue, "Exploration and map acquisition for view-based navigation in corridor environment," Proceedings of the International Conference on Field and Service Robotics, pp. 341-346, 1999.

[8] J. Chahl and M. Srinivasan, "Panoramic vision system for imaging, ranging and navigation in three dimensions," Proceedings of the International Conference on Field and Service Robotics, pp. 127-132, 1999.

[9] R. Bischoff, "Advances in the development of the humanoid service robot HERMES," Proceedings of the International Conference on Field and Service Robotics, pp. 156-161, 1999.

[10] D. Apostolopoulos, M. Wagner, and W. Whittaker, "Technology and field demonstration results in the robotic search for Antarctic meteorites," Proceedings of the International Conference on Field and Service Robotics, pp. 185-190, 1999.

[11] T. Oomichi, N. Kawauchi, and Y. Fuke, "Hierarchy control system for vehicle navigation based on information of sensor fusion perception depending on measuring distance layer," Proceedings of the International Conference on Field and Service Robotics, pp. 197-201, 1999.

[12] R. Chatila, G. Andrade, S. Lacroix, and A. Mallet, "Motion control for a planetary rover," Proceedings of the International Conference on Field and Service Robotics, pp. 381-388, 1999.

[13] A. Soto, M. Saptharishi, A. Ollennu, J. Dolan, and P. Khosla, "Cyber-ATVs: dynamic and distributed reconnaissance and surveillance using all terrain UGVs," Proceedings of the International Conference on Field and Service Robotics, pp. 329-334, 1999.

[14] D. Langer, M. Mettenleiter, F. Hartl, and C. Frohlich, "Imaging laser radar for 3-D surveying and CAD modelling of real world environments," Proceedings of the International Conference on Field and Service Robotics, pp. 13-18, 1999.

[15] A. Foessel, S. Chheda, and D. Apostolopoulos, "Short-range millimeter-wave radar perception in a polar environment," Proceedings of the International Conference on Field and Service Robotics, pp. 133-138, 1999.

[16] P. Corke, G. Winstanley, J. Roberts, E. Duff, and P. Sikka, "Robotics for the mining industry: opportunities and current research," Proceedings of the International Conference on Field and Service Robotics, pp. 208-219, 1999.

[17] E. Prassler, J. Scholz, and P. Fiorini, "Maid: A robotic wheelchair roaming in a railway station," Proceedings of the International Conference on Field and Service Robotics, pp. 31-36, 1999.

[18] S. Thrun, M. Bennewitz, W. Burgard, A. Cremers, F. Dellaert, D. Fox, D. Hahnel, G. Lakemeyer, C. Rosenberg, N. Roy, J. Schulte, and W. Steiner, "Experiences with two deployed interactive tour-guide robots," Proceedings of the International Conference on Field and Service Robotics, pp. 37-42, 1999.

[19] R. Meier, T. Fong, C. Thorpe, and C. Baur, "A sensor fusion based user interface for vehicle teleoperation," Proceedings of the International Conference on Field and Service Robotics, pp. 244-249, 1999.

[20] M. Torrie, S. Veeramachaneni, and B. Abbott, "Laser-based obstacle detection and avoidance system," Proceedings of the SPIE Conference on Robotics and Semi-Robotic Ground Vehicle Technology, pp. 2-7, 1998.

[21] I. Ulrich and J. Borenstein, "VFH+: Reliable obstacle avoidance for fast mobile robots," Proceedings of the 1998 IEEE Conference on Robotics and Automation, pp. 1572-1577, 1998.


[22] J. Borenstein and Y. Koren, "The vector field histogram – fast obstacle avoidance for mobile robots," IEEE Journal of Robotics and Automation, vol. 7, no. 3, pp. 278-288, June 1991.

[23] J. Borenstein and Y. Koren, "Histogramic in-motion mapping for mobile robot obstacle avoidance," IEEE Journal of Robotics and Automation, vol. 7, no. 4, pp. 535-539, 1999.

[24] J. Borenstein and Y. Koren, "Real-time obstacle avoidance for fast mobile robots in cluttered environments," Proceedings of the IEEE Conference on Robotics and Automation, pp. 572-577, 1990.

[25] J. Borenstein and Y. Koren, "Real-time avoidance for fast mobile robots," IEEE Transactions on Systems, Man, and Cybernetics, vol. 19, no. 5, pp. 1179-1187, Sept./Oct. 1989.

[26] B. Holt and J. Borenstein, "OmniNav: Obstacle avoidance for large, non-circular, omnidirectional mobile robots," Robotics and Manufacturing, vol. 6, pp. 311-317, May 1996.

[27] S. Shoval and J. Borenstein, "Mobile robot obstacle avoidance in a computerized travel aid for the blind," Proceedings of the 1994 IEEE Robotics and Automation Conference, pp. 2023-2029, 1994.

[28] J. Borenstein and Y. Koren, "Obstacle avoidance with ultrasonic sensors," IEEE Journal of Robotics and Automation, vol. 4, no. 2, pp. 213-218, April 1998.

[29] I. Kamon and E. Rivlin, "Sensory-based motion planning with global proofs," IEEE Transactions on Robotics and Automation, vol. 13, no. 6, pp. 814-822, Dec. 1997.

[30] A. Fujimori, P. Nikiforuk, and M. Gupta, "Adaptive navigation of mobile robots with obstacle avoidance," IEEE Transactions on Robotics and Automation, vol. 13, no. 4, pp. 596-602, Aug. 1997.

[31] I. Schwatz, "PRIMUS: An autonomous driving robot," Proceedings of SPIE Unmanned Ground Vehicle Technology, pp. 150-159, 1999.

[32] J. Chuang and N. Ahuja, "An analytically tractable potential field model of free space and its application in obstacle avoidance," IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics, vol. 28, no. 5, pp. 729-735, Oct. 1998.


Appendix A


LMS 220 Specifications