Citation
Rule-based feature identification for indoor topological mapping using ultrasonic sensors

Material Information

Title:
Rule-based feature identification for indoor topological mapping using ultrasonic sensors
Creator:
Wetherbie, John Osber
Place of Publication:
Denver, CO
Publisher:
University of Colorado Denver
Publication Date:
1999
Language:
English
Physical Description:
xiii, 243 leaves : illustrations ; 28 cm

Subjects

Subjects / Keywords:
Mappings (Mathematics) ( lcsh )
Connections (Mathematics) ( lcsh )
Robots ( lcsh )
Navigation ( lcsh )
Connections (Mathematics) ( fast )
Mappings (Mathematics) ( fast )
Navigation ( fast )
Robots ( fast )
Genre:
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )

Notes

Bibliography:
Includes bibliographical references (leaves 241-243).
Thesis:
Computer science
General Note:
Department of Computer Science and Engineering
Statement of Responsibility:
by John Osber Wetherbie III.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
42613912 ( OCLC )
ocm42613912
Classification:
LD1190.E52 1999m .W48 ( lcc )



Full Text
RULE-BASED FEATURE IDENTIFICATION FOR
INDOOR TOPOLOGICAL MAPPING USING
ULTRASONIC SENSORS
by
John Osber Wetherbie III
B.S., University of California at Los Angeles, 1984
A thesis submitted to the
University of Colorado at Denver
in partial fulfillment
of the requirements for the degree of
Master of Science
Computer Science
1999


This thesis for the Master of Science
degree by
John Osber Wetherbie III
has been approved
by
Date


Wetherbie III, John Osber (M.S., Computer Science)
Rule-Based Feature Identification for Indoor Topological Mapping Using Ultrasonic Sensors
Thesis directed by Assistant Professor Christopher E. Smith
ABSTRACT
The goal of this study is to develop a method for identifying indoor features such as
corridors, alcoves, and T junctions suitable for use in mapping, localization, and navigation.
This study differs from the majority of work in the area of feature identification due to the
focus on detecting large-scale features, e.g., corridors, alcoves, etc., instead of small-scale
features such as corners, edges, and planes. A set of feature patterns is used to represent the
real-world features to be identified. Each feature pattern contains a set of rules that are
matched against sonar observations taken by a mobile robot. Sonar readings are taken at
specified distance intervals by the robot and matched against the rule sets of the various
patterns. The rules are very simple and involve comparing a sonar range reading to one or
two threshold values. This simplicity leads to reduced computational complexity and to a
more flexible and extensible feature identification algorithm. The pattern which has the
highest percentage of its rules satisfied by the sonar data is selected as the current feature.
Tests are conducted to identify single features alone and a series of features that would
normally be seen by a robot traversing an indoor environment. Results demonstrate that


the proposed methodology successfully identifies all features in the feature set.
Misidentifications can occur at the transitions between features in some cases. These events
are reduced by using filtered data for feature identification instead of raw sonar data.
This abstract accurately represents the contents of the candidate's thesis. I recommend its
publication.


DEDICATION
This thesis is dedicated to my wife Michele, our two sons, Andrew and Ryan, and to
the loving memory of my mother.


ACKNOWLEDGEMENTS
I would like to thank several people for their contributions to this thesis effort. My
thanks to my advisor, Chris Smith, for his help in coming up with the topic for this research
and his continued guidance during the time it took to complete this work. My greatest thanks
goes to my family. First to my wife Michele, who made sure I had the evenings and
weekends I needed to complete this research. Next to Andrew, who was very good about
letting Daddy use the computer when he needed to study. Finally to Ryan, who mostly played
quietly and did not put my thesis in his mouth once.


CONTENTS
Chapter
1. Introduction.....................................................................1
1.1 Background......................................................................1
1.2 Problem Statement..............................................................4
1.3 Overview.......................................................................4
2. Related Work....................................................................5
2.1 Overview.......................................................................5
2.2 Grid-Based Representations.....................................................7
2.3 Topological Representations...................................................14
2.4 Feature Identification........................................................20
3. Approach.......................................................................28
3.1 Overview.......................................................................28
3.2 Ultrasonic Sensors............................................................29
3.2.1 Theory.......................................................................29
3.2.2 Polaroid 600 Ultrasonic Sensor...............................................32
3.3 Robot.........................................................................34
3.4 Software......................................................................36
3.4.1 Feature Set..................................................................39
3.4.2 Rules........................................................................46


4. Results........................................................................51
4.1 Overview........................................................................51
4.2 Results of Individual Feature Identification....................................51
4.2.1 Corridor..................................................................... 52
4.2.2 Four Way Intersection.........................................................54
4.2.3 Up T Intersection.............................................................58
4.2.4 Across T Intersection........................................................60
4.2.5 L Intersection................................................................63
4.2.6 Alcove........................................................................66
4.2.7 Dual Alcove...................................................................68
4.2.8 Corridor End..................................................................70
4.2.9 Alcove End....................................................................72
4.2.10 Dual Alcove End..............................................................74
4.3 Results of Multiple Feature Identification......................................76
4.4 Summary of Results..............................................................83
5. Conclusions......................................................................88
5.1 Evaluation......................................................................88
5.2 Future Work....................................................................89
5.3 Summary.........................................................................91


Appendix
A. Computer Program............................................................92
B. Program Output and Configuration File......................................221
References................................................................... 241


FIGURES
Figure
1.1 Map to Nearest Copier............................................................1
2.1 A Simple Grid-Based Representation..............................................7
2.2 Basic Sensor Model..............................................................9
2.3 A Simple Topological Map.......................................................15
2.4 Staggered Hallway..............................................................19
2.5 Sonar Transmitter and Receiver................................................21
2.6 Angle of Incidence for Planes and Corners......................................22
3.1 Sonar Transducer Lobes.........................................................30
3.2 Cross Section of a Sonar Beam..................................................31
3.3 Sonar Ring of the Scout Robot................................................ 33
3.4 Arrangement of the Sonar Sensors............................................. 33
3.5 Nomadic Technologies Scout Robot...............................................35
3.6 System Configuration...........................................................36
3.7 Flow of the C++ Program........................................................38
3.8 Indoor Feature Set.............................................................40
3.9 Predicted Sonar Data for a Corridor............................................41
3.10 Predicted Sonar Data for a Four Way Intersection..............................41
3.11 Predicted Sonar Data for an Up T Intersection.................................42


3.12 Predicted Sonar Data for an Across T Intersection..................................42
3.13 Predicted Sonar Data for an L Intersection.........................................43
3.14 Predicted Sonar Data for an Alcove.................................................43
3.15 Predicted Sonar Data for a Dual Alcove.............................................44
3.16 Predicted Sonar Data for a Corridor End............................................44
3.17 Predicted Sonar Data for an Alcove End.............................................45
3.18 Predicted Sonar Data for a Dual Alcove End.........................................45
3.19 match Function from FrontLongRule..................................................50
4.1 Sonar Plot for a Corridor...........................................................52
4.2 Predicted and Actual Sonar Data for a Corridor......................................53
4.3 Sonar Plot for a Four Way Intersection..............................................55
4.4 Predicted and Actual Sonar Data for a Four Way Intersection.........................56
4.5 Sonar Plot for an Up T Intersection.................................................58
4.6 Predicted and Actual Sonar Data for an Up T Intersection............................59
4.7 Sonar Plot for an Across T Intersection.............................................61
4.8 Predicted and Actual Sonar Data for an Across T Intersection........................62
4.9 Sonar Plot for an L Intersection....................................................63
4.10 Predicted and Actual Sonar Data for an L Intersection..............................64
4.11 Sonar Plot for an Alcove...........................................................66
4.12 Predicted and Actual Sonar Data for an Alcove......................................67


4.13 Sonar Plot for a Dual Alcove...................................................68
4.14 Predicted and Actual Sonar Data for a Dual Alcove..............................69
4.15 Sonar Plot for a Corridor End..................................................70
4.16 Predicted and Actual Sonar Data for a Corridor End.............................71
4.17 Sonar Plot for an Alcove End...................................................72
4.18 Predicted and Actual Sonar Data for an Alcove End..............................73
4.19 Sonar Plot for a Dual Alcove End...............................................74
4.20 Predicted and Actual Sonar Data for a Dual Alcove End..........................75
4.21 Map of Multiple Feature Environment............................................77
4.22 A Portion of the Multiple Feature Environment..................................78
4.23 Sonar Plot for Multiple Feature Environment....................................79
4.24 Robot Detecting a Corridor End Instead of an Up T...............................84
4.25 Robot Unable to Differentiate an Up T from a Dual Alcove........................85
4.26 Robot Unable to Differentiate an Alcove End from an L Intersection..............85
4.27 Misidentification of a Four Way as a Dual Alcove................................86
4.28 Misidentification of a Four Way as an Up T......................................86


TABLES
Table
3.1 Rule to Feature Mapping...............................................................49
4.1 Percentage of Correct Feature Identifications.........................................82
4.2 Raw and Averaged Data Test Feature Identification.....................................83


1. Introduction
1.1 Background
Recall the last time you were asked for directions to a particular office or location in
a building you work in, say where the nearest copier is located. It is likely that the way you
answered went something like this:
"Go to the T up ahead and turn right. Go to the end of that corridor where it Ts and turn
left. Go through a corridor intersection and past an alcove on the right. The next room on the
left is the copier room. If you come to another T you've gone too far."
You could also provide this information graphically in the form of a simple map. The
map wouldn't necessarily be to scale but would indicate the features highlighted in the verbal
description, as below:
Figure 1.1: Map to Nearest Copier


Both of these methods for providing directions emphasize the features that a person
would see as they proceeded towards the copier. The distance to the location and the time to
get there are secondary or irrelevant in most indoor situations.
Now, the copier in this case seems pretty far away but you would probably feel fairly
certain that the person asking for directions will be able to successfully find the copier
without too much effort. What if it was a robot that was asking for directions? Would a
robot be able to understand concepts like corridor, alcove, and T? Would it be able to
recognize these features when it came upon them?
To perform useful tasks a mobile robot needs a model of the "near world" [19] that
represents the world's configuration and the robot's location within this environment. Using
a model of its surroundings a robot can avoid obstacles, identify changes in the
environment, and navigate its way to the nearest copying machine. A map is a symbolic
construction and is meant to describe discrete entities, objects or places in the environment
[7]. The model, or map, that a robot maintains must provide the information needed to do its
work properly.
There are two main ways to express the model of the real world for a robot: grid-
based and topological. Grid-based maps capture, as accuracy allows, the true
layout of an environment. The verbal directions and the simple map presented above are
topological in nature. Dudek [7] states that "Most humans naturally conceptualize
navigation information in both symbolic or topological terms as well as quantitative metric
terms, depending on the context, task, and scale."


Topological representations emphasize qualitative and relational information ("Go
past two corridor intersections and an alcove on the right.") instead of exact measurements.
Janet, et al., [11] look at this as a more intuitive way to represent an environment than
approaches that provide specific distance information.
Dudek [7] does make the point that in many cases a topological map that is created
and used by a robot does not need to match up with the topological map/description that a
human would use to describe an area. A single room to a person could appear as two or more
rooms to a robot.
While this difference in perceiving and describing the environment may not be
critical for many types of robots, think back to the robot looking for the nearest copier.
Unless the robot has a map of the building that contains the features that a person sees, it will
be difficult, if not impossible, for it to find the copier. This does not preclude the robot from
having additional features or information, but it must be able to understand that a room or a
hallway that a person sees is a single entity.
One of the most popular methods for developing a map of a robot's surroundings is
to collect sonar data via ultrasonic sensors. This sonar data can then be used to construct
grid-based representations or topological models, as described above, where the data has
been manipulated to identify/construct simple geometric shapes that compose the
environment [5] [15] [17] [26].
Grid-based maps present the environment as areas that are occupied or unoccupied
with a granularity that is defined by the size of the cells that make up the grid. The
topological approaches emphasize the relationships between landmarks or sets of primitives:


planes, corners, edges [5] or walls, corners, edges, and cylinders [17] and then attempt to
connect them together in a representation of the robot's surroundings.
This research presents a method which will allow a robot to identify large-
scale indoor features such as corridors, alcoves, and T intersections. This approach has the
advantage of using features which have semantic value to humans as the building blocks
(primitives) of the environmental representation as compared to grid-based and small-scale
topological approaches. This will allow a high-level interface to robots which is more in line
with the way humans perceive and interact with the environment while being based on
constructs that robots can successfully identify directly from sensor data.
1.2 Problem Statement
Current research in developing environmental models for mobile robots uses grid-
based approaches, topological maps of small-scale features, or a combination of both.
None of these approaches identifies features on the scale of corridors, intersections, etc.,
directly. This thesis develops an approach to identify these large-scale indoor features
directly from observations taken by a mobile robot.
1.3 Overview
The next chapter presents related work in the areas of grid-based maps, topological
representations, and feature identification. Chapter 3 describes the methodology that was
used for this work. Chapter 4 gives the results of the feature identification experiments.
Chapter 5 draws conclusions based on the results of this research and suggests possible areas
for future work.


2. Related Work
2.1 Overview
Robots have two primary ways of expressing knowledge about their surroundings:
grid-based representations and topological representations. These maps are used by robots
for obstacle avoidance, navigation, and localization; that is, for determining where the robot
is within an environment [18] [19]. Not surprisingly, much of the work reviewed in this
chapter deals with these topics.
The research presented in this thesis diverges from the majority of this work in one
important aspect. The focus is to develop an approach which will allow the identification of
large-scale features in an indoor environment. Most of the work surveyed deals with the
identification of primitive or small-scale features such as corners and planes, if it addresses
feature recognition at all. Large-scale features could be used to create a representation of the
robots surroundings for navigation and localization. They would also have the benefit of
having a semantic meaning to accompany the representation; that is, a corridor feature not
only looks like a corridor but it is identified as such. A map with these types of features
could be used to help the robot solve the "find the nearest copier" problem presented in
Chapter 1.
Maps and descriptions using large-scale features are used by humans to provide
meaningful directions all the time. The use of higher-level abstractions/concepts provides a
simpler and more compact way of providing information. A person will better understand


and better use the directions "go straight down this corridor to the first alcove on the left"
than the directions "go thirty-six tiles straight ahead, stop, then turn to the left." Both sets of
directions are accurate and provide the means for a person to get to the copier. From that
standpoint the directions are equivalent. Now put yourself in the position of the person
wanting to find the copier. Do you want to count the number of tiles you have walked over
from where you received the directions? What would happen if the distance given was too
short or too long? Blindly following the directions would mean that you would go past the
copier or stop before you reached your destination. Are the sets of directions still equivalent?
It is also possible that the feature-based directions might be incorrect. In this case you are no
worse off than you were with directions that specified an incorrect distance to travel.
In contrast, it might be easier for a robot to use distance traveled than to identify
environmental features to determine if the robot has reached its goal. The robot could be
provided with an a priori map of the environment that specified distances. It could be
commanded to explore the environment and build a metric-based map itself. In either case
the ability of the robot to find the copier is dependent on its localization capability which is
based on the ability to determine distance traveled accurately. Depending on this accuracy is
not necessarily reasonable since localization and navigation are the subjects of many current
research efforts.
Use of a feature-based map will eliminate some, if not all, of these problems. The
use of features will also aid in interacting with humans. A person would be able to provide
directions to and receive directions from a robot in a much more natural way if the
information is topological in nature. Using features which have semantic value to humans


and robots as the building blocks for maps will have significant advantages over
representations that may have more precise distance information but are more restrictive.
2.2 Grid-Based Representations
Grid-based approaches use one or more two-dimensional arrays of cells as the basis
for a model of the world. In their simplest form, cells that appear occupied are marked as
such by assigning them a value of 1. Empty cells have a value of 0. Cells that are located in
unexplored territory could be marked as unknown. Figure 2.1 is an example of a simple grid.
Figure 2.1: A Simple Grid-Based Representation
The grid map in Figure 2.1 shows what the meeting of two corridors, an L intersection, might
look like. Cells that are occupied are marked with an X. Unknown cells are designated by a
dash in this example. Unoccupied cells are shown as empty.
More sophisticated approaches have probability values associated with each cell.
The approaches commonly used to determine and update the probability values of the cells
include Bayesian statistics, Dempster-Shafer methods, or variations of these two.
For Bayesian reasoning approaches [8] [16] [17] [26] the values are the probability
of the cell being occupied and the probability of the cell being unoccupied. For Dempster-


Shafer methods [18] [23] the values are the support for the cell being occupied and the
support for the cell being empty. Howard and Kitchen [10] add a third value relating to the
support for the cell being occupied or empty.
For both probability updating schemes, it is common for each cell in the grid to have
its probabilities initialized to a specific value [10] [17]. For Bayesian approaches a value of
0.5 for the occupied and unoccupied probabilities is commonly used. The value of 0.5
represents the fact that it is equally likely that a cell is occupied as it is unoccupied before
any readings have been taken. For Dempster-Shafer a mass distribution of:
m(Rφ) = 0
m(¬Rφ) = 0
m(Rφ ∨ ¬Rφ) = 1                                                     (2.1)
could be used. This indicates no support for the cell being occupied, no support for the cell
being empty, and support for the cell being occupied or empty.
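As an illustration only (this code is not from the thesis or the cited papers), the
following C++ sketch initializes a grid cell with the mass distribution of equation (2.1);
the struct and field names are hypothetical:

#include <cassert>

// Hypothetical cell type holding a Dempster-Shafer mass distribution.
struct CellMass {
    double occupied;   // m(R phi): support for "occupied"
    double empty;      // m(not R phi): support for "empty"
    double either;     // m(R phi or not R phi): support for either state
};

// Initialization per equation (2.1): before any readings, all mass lies
// on the disjunction, expressing complete ignorance about the cell.
CellMass initialCellMass() {
    CellMass m{0.0, 0.0, 1.0};
    assert(m.occupied + m.empty + m.either == 1.0);  // masses sum to one
    return m;
}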
The Bayesian and Dempster-Shafer updating schemes act on data that indicates
whether a cell may be empty or occupied. This data is presented to these algorithms via a
sensor model. The research documented in [8] [17] provides the basic sensor model that is
used, with some modifications, by all grid-based approaches. This sensor model has the
following properties:
1) If a cell is closer than the range indicated by the sonar measurement then the
likelihood of the cell being empty increases.
2) If a cell is farther away than the range indicated by the measurement then the
occupied/empty probabilities don't change. Essentially there is nothing that can
be determined about the cell.


3) If a cell is at the same distance as the range reading, the cell may be the cause of
the return and the probability of occupancy is increased. The amount the
probability of occupancy increases is usually inversely proportional to the range
of the reading. This spreads the probability that one particular cell in an arc of
cells is causing the return. This essentially gives more weight to short range
readings than to long range readings.
Thus a grid map could have areas that are probably empty, probably occupied, and
unknown. Figure 2.2 shows an example of this sonar model.
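To make the three properties concrete, here is a minimal C++ sketch, hypothetical
rather than the updating scheme of any one cited paper; the function name, the cell-size
tolerance, and the increment and decrement constants are assumptions for illustration:

#include <algorithm>
#include <cmath>

// Apply one sonar reading to one cell along the beam. Cells closer than
// the reading become more likely to be empty (property 1), the cell at
// the measured range becomes more likely to be occupied, weighted to
// favor short readings (property 3), and cells beyond the reading are
// left unchanged (property 2).
void updateCell(double& pOccupied, double cellDist, double measuredRange,
                double maxRange)
{
    const double epsilon = 0.05;  // assumed tolerance around the reading
    if (cellDist < measuredRange - epsilon) {
        // The beam passed through this cell, so it is likely empty.
        pOccupied = std::max(0.0, pOccupied - 0.05);
    } else if (std::fabs(cellDist - measuredRange) <= epsilon) {
        // This cell may have caused the return; the increment is
        // inversely proportional to the range of the reading.
        double weight = 1.0 - measuredRange / maxRange;
        pOccupied = std::min(1.0, pOccupied + 0.1 * weight);
    }
    // Cells farther than the reading: nothing can be determined.
}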
The Histogramic In-Motion Mapping (HIMM) approach [2] describes a
simplification of the above model. HIMM only uses a single probability value known as the
Certainty Value with a range of 0 to 15. More importantly, only the cells along the acoustic
axis of the sensor are updated by a measurement versus the entire beam-width of the sensor.
The one cell that has its Certainty Value incremented is the cell at the measured range of the
reading. The empty cells between the sensor and the cell at the measured range have their
Certainty Values decremented. When a Certainty Value is incremented, its value is increased
by three but when the value is reduced the decrement is one. These values for incrementing
and decrementing the Certainty Values were arrived at based on experimentation.
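A minimal sketch of a HIMM-style update follows, assuming the cells along the
acoustic axis have been collected into a vector ordered from the sensor outward; the
function and container are hypothetical, but the increment of three, decrement of one, and
the 0 to 15 clamp follow the description above:

#include <algorithm>
#include <cstddef>
#include <vector>

// Update the Certainty Values of the cells on the acoustic axis. Only
// the cell at the measured range is raised (by 3); the cells between
// the sensor and that cell are lowered (by 1). Values stay in 0..15.
void himmUpdate(std::vector<int>& axisCells, std::size_t measuredIndex)
{
    for (std::size_t i = 0; i < measuredIndex && i < axisCells.size(); ++i)
        axisCells[i] = std::max(0, axisCells[i] - 1);   // likely empty
    if (measuredIndex < axisCells.size())
        axisCells[measuredIndex] =
            std::min(15, axisCells[measuredIndex] + 3); // likely occupied
}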


An interesting problem that arises with the use of sonar sensors in mapping indoor
environments is that most surfaces in an indoor environment are very good acoustic mirrors
[3]. If a surface is at an angle to the sensor then the echo return could be partially or
completely reflected away from the sensor. This could make the surface appear farther away
or even invisible to an ultrasonic sensor [3] [10]. The sound pulse could also bounce off
multiple surfaces before being received by the sonar sensor. This would also yield a reading
that would indicate a surface is farther away than it is in actuality [3] [10] [16].
The approaches outlined in [2] [8] [17] took a simple approach to handling the
specular qualities of indoor environments by rejecting range readings above a certain
maximum value. This limit was meant to eliminate the problems caused by specular
reflection based on the assumption that range readings caused by specular reflection occur at
the maximum range of a sonar sensor.
Howard and Kitchen [10] use a modified occupancy grid called a response grid to
build a model of an indoor environment. A response grid is a two-dimensional array of cells
like an occupancy/certainty grid. The modification attempts to provide a more realistic
handling of specular reflection to build a more accurate occupancy grid.
The basic premise of the response grid is that a cell can appear to be occupied when
viewed from one direction but will appear empty (that is, not generate a response) from
another. A smooth surface only returns an echo to a sensor when the angle of incidence
between the surface and the sonar beam is near zero. At larger angles of incidence the
surface will return a weak response or none at all. Thus the same feature at the same location
would appear differently based on the location of the sonar sensor. In other grid-based


approaches this would lead to a contradiction since they assume that an occupied cell would
appear occupied from all directions.
Each cell in the grid can be empty or contain one or more surfaces that will reflect
ultrasonic pulses. Cell occupancy is determined by assuming that a cell that has an echo
return from it, in one or more directions, must have a surface in it and is therefore occupied.
This method leaves open the possibility that a cell might have surfaces in it that do not
reflect in any direction. Howard and Kitchen state that the conditions that could cause this
are rare in indoor environments.
Whether a cell is occupied or not is indicated by the state variable Occ that can be
set to one of two values:
Occ(x,y) = [occupied, unoccupied] (2.2)
The response of a cell for a direction φ is maintained by the state variable Res, which
can be set to one of two values:
Res(x,y,φ) = [response, no response]                                (2.3)
Thus a cell that has one or more responses associated with it will be marked as
occupied with some amount of probability based upon the update methodology used.
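A minimal C++ sketch of a response-grid cell in the spirit of this description follows;
the eight-way discretization of the direction φ and all names are assumptions for
illustration:

// A cell holds a single occupancy state plus one response state per
// viewing direction, so it can return an echo from one direction and
// none from another without contradiction.
struct ResponseCell {
    enum class Occ { unoccupied, occupied };
    enum class Res { no_response, response };

    static constexpr int kDirections = 8;   // assumed discretization of phi
    Occ occ = Occ::unoccupied;
    Res res[kDirections] = {};              // zero-initializes to no_response

    // A cell with a response in any direction is treated as occupied.
    bool anyResponse() const {
        for (int i = 0; i < kDirections; ++i)
            if (res[i] == Res::response) return true;
        return false;
    }
};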
Lim and Cho [16] also address the problem of specular reflection and how it can
affect the accuracy of a grid-based map. This paper introduced the concept of specular
reflection probability and used it to modify the Bayesian updating function for the grid. The
specular reflection probability is composed of two components: one is the Range Confidence
Factor (RCF) and the other is the orientation probability.


As noted above, specular reflection can cause misleading range information which
indicates an obstacle is farther away or that it is not there. As with the approaches in [2] [8]
[17] this will lead to interpreting range readings toward the maximum range of the sensor as
specular reflection. In contrast, instead of using a limit that throws away the data, [16]
applies specular reflection probability to reduce the confidence of the reading. The Range
Confidence Factor is used to reduce the confidence in the range of long range readings. This
will reduce the amount that a cell's probability of occupancy will be incremented if that cell
is at a long range from the sensor.
The orientation probability indicates that the surface within a cell is oriented with a
certain probability. This probability is determined by collecting data at different locations for
the cell. In this sense it is somewhat like summing up all the Res state variables from [10] to
determine the orientation of the surface. The orientation probability is used to affect the
confidence of a range reading. Remember that the incidence angle of the sensor beam is also
a major parameter for specular reflection. If a cell appears to return a response but the
orientation probability indicates that the surface orientation is such that the angle of
incidence is high then the probability of occupancy should not be incremented as much as if
the incidence angle was low.
Standard sonar sensors have very poor angular resolution because of their wide
beam widths [2] [6] [8] [10] [16] [17] [22]. This makes it necessary to combine range
readings for a cell from a number of different positions to determine the probability of
whether a cell is occupied or unoccupied.


As an illustration of how this works assume that one reading indicates that there is an
object along an arc at a particular range. A second reading that indicates empty space
intersects the first arc. The area of intersection could be considered to have an increased
probability of being empty, or the surface orientation of the cell may be such that it provided a
better return from one position than another. This information will cause the probabilities of
occupancy and emptiness to be updated for the cells in the two sensor sweeps. As more data
is accumulated for a cell the probabilities will approach the real state of the cell.
Grid-based approaches suffer from space and time complexity [11] [26], especially
as the size of the environment represented grows. Moravec [17] mentions that the grid
representation was somewhat reluctantly adopted at Carnegie Mellon's Mobile Robot
Laboratory. This reluctance may have been related to the amount of memory required to use
a grid-based map.
The approaches in [12] and [19] address the space and computational complexities of grid-based
approaches by using tree structures to reduce the amount of data needed to represent an
environment. These tree structures are called Quadtrees and Octrees based on the maximum
number of children, four and eight, respectively, that a node in the tree may possess.
Quadtrees are commonly used to represent two-dimensional areas while Octrees are normally
applied to three-dimensional spaces [12].
The root node of the tree represents the entire grid map and the subsequent
generations of child nodes recursively partition the map into smaller and smaller areas.
Nodes can represent empty space, occupied space, or partially occupied space. Empty and
occupied nodes have no children while partially occupied nodes have the maximum number


of children (4 or 8) for the type of tree. In this way a single node can be substituted for a
large number of grid elements and reduce the memory required for the representation [12].
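A minimal sketch of such a quadtree node follows, assuming a three-state occupancy
classification; the structure is hypothetical but follows the subdivision rule described
above:

#include <array>
#include <memory>

// A node for a two-dimensional grid map. Uniformly empty or occupied
// regions are single leaves; only partially occupied regions are
// subdivided into four children, so large uniform areas cost one node
// instead of many grid cells.
struct QuadNode {
    enum class State { empty, occupied, mixed };
    State state = State::empty;
    std::array<std::unique_ptr<QuadNode>, 4> children;  // used only if mixed

    void subdivide() {
        state = State::mixed;
        for (auto& c : children)
            c = std::make_unique<QuadNode>();
    }
};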
To summarize, grid-based approaches are a popular way to represent the
environment that a robot is traversing. This is in spite of the specular reflection and angular
resolution problems associated with the sonar sensors that are commonly used to provide
input to these maps. Ultrasonic sensors are attractive for use because of simplicity, low cost,
and distance measurements that are provided directly [8]. These distance measurements can
be directly placed into a grid-based map or modified to take into account uncertainties due to
specular reflection and angular resolution. Grid-based approaches suffer from space and time
complexity as the environment represented grows larger. Tree structures have been used to
reduce the memory required by grid maps.
2.3 Topological Representations
A topological map provides a high-level description of an environment that is
dependent on the structure and features detectable in that environment [19]. A topological
map could be a simple list of features [19] but more commonly the underlying
implementation is a graph [4] [7] [26]. The nodes in the graph represent interesting locations
or features found in the environment. These interesting locations, such as a room or a
corridor intersection, act as recognizable landmarks. The arcs of the graph can represent the
path between distinct places, like a hallway or a doorway, or the transitions between features.
Figure 2.3 shows a simple topological map with arcs representing hallways and numbers
indicating interesting areas, such as offices, intersections, etc., that were identified in the
exploration of an area.


Figure 2.3: A Simple Topological Map
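A minimal sketch of the graph structure underlying such a map follows; the types and
fields are hypothetical, but nodes stand for recognizable places and arcs for the paths
between them, optionally annotated with distance:

#include <string>
#include <vector>

// A place is an interesting, recognizable location in the environment.
struct Place {
    int id;
    std::string label;          // e.g., "office", "T intersection"
};

// A path connects two places and may carry metric information.
struct Path {
    int from, to;               // indices into the places vector
    double distance;            // distance annotation, if available
};

struct TopologicalMap {
    std::vector<Place> places;
    std::vector<Path> paths;
};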
Topological models usually provide a more compact representation of the
environment than grid-based approaches [7] [26]. They are also somewhat more intuitive
than grid-based approaches in the sense that they recognize distinct or recognizable areas in
the way a person might [7] [11]. Topological maps express the environment in terms of
landmarks, such as those used in the nearest-copier example from Chapter 1.
Topological maps can be constructed directly by accumulating nodes and arcs as a
robot navigates through an environment or by translating another representation into a
topological map. Other representations that have been translated into topological maps
include grid-based maps [8] [26] and clusters of points that were assembled by a neural net
[11].
The approach of Elfes [8] uses a hierarchy of maps. The grid-based maps are the first level,
called the sensor level. The second level is the geometric level, created by identifying groups


of cells with high probabilities of occupancy and labeling them as unique entities, such as
desks and chairs. The third level is the symbolic level. The maps at this level are topological
in nature and contain information about larger areas than the maps on the lower two levels.
Nodes may represent interesting areas where additional information is provided or they could
represent corridors.
Thrun and Bucken [26] also build a topological map of the environment based on a
grid-based representation. To accomplish this the free space of the grid map is partitioned
into a number of regions separated by critical lines. Critical lines correspond to narrow
passages such as doorways. The partitioned map is then converted into a graph. The regions
are mapped to nodes and the critical lines, that is, the paths between the regions, become
arcs.
Pure topological approaches often have difficulty distinguishing between two locations that
look alike, especially if they have been reached by different routes. This is because
topological approaches primarily use the robot's position relative to landmarks to determine
where they are in the environment. The methods described in [8] and [26] use grid-based and
topological maps in combination to distinguish different locations that look alike
topologically. Accessing the position data available from a grid map makes differentiating
topologically similar locations easier.
Janet, et al., [11] use the combination of a neural network and hyper-ellipsoid
clustering to create a topological map. Hyper-Ellipsoid Clustering (HEC) groups sets of
points obtained from sonar data into elliptical areas. The elliptical areas filter out outlying
data points. By using a hill-climbing algorithm the data presented by the HEC Kohonen


method can be used for location recognition. The approach presented does identify large-
scale features like corridors as areas of open space but the primary focus is on identifying
and utilizing collections of line segments as landmarks.
Chong and Kleeman [5] use a grid map and a topological map in concert. The grid
map is used for obstacle avoidance while the feature-based map is used for localization.
Features are classified as planes, corners, edges, or unknown. Unknown features are not
added to the topological map. One interesting note about the grid-based map is that the cells
have occupancy and distance information associated with them instead of probabilities of
occupancy and emptiness.
Bulata and Devy [4] use a hierarchy of models to build a topological representation
of an environment, similar to [8] and [26], but do not use grid-based maps. Another
difference is that the robot in this research uses a laser range finder instead of sonar sensors
to acquire information about its environment. The basic building block in this approach is a
landmark. Landmarks are defined as a set of line segments and include doorways, with and
without doors, portions of corridors, and geometric features found in rooms and corridors.
Only "useful" [4] landmarks in the data are extracted and used to construct the geometric
map.
The geometrical model contains information about landmarks. The symbolic model
groups landmarks into areas. An area corresponds to a complete entity such as a room or a
corridor. The topological model is a map composed of areas which define the entire known
environment of the robot.


Dudek [7] uses an approach that is very similar to that presented in [26]. Grid maps
of local areas are merged together into a topological representation. The topological map is
composed of metric regions that have position and distance data collected for them, non-
metric regions that serve as parts of a topological link between metric regions, and unknown
regions that have not been visited by the robot. An office or other interesting location
would be an example of a metric region. A hallway could be a non-metric area.
Line segments are used to model collections of observations of the environment.
Each segment can be thought of as representing a section of a wall or other obstacle. Line
segments that are close together and parallel or near parallel are merged together.
Large-scale features could be generated from the grid maps but are not because this
would require domain-specific assumptions about the environment. Dudek [7] cites the
example of an office building which would require rules and constraints such as typical
offices having rectilinear orientations and hallways and doors having standard widths.
It is interesting to note that the research presented in this thesis uses rules very much
like those mentioned in [7] for identifying large-scale features. However, this work doesn't
depend on the types of restrictive assumptions that Dudek feels are necessary for performing
feature identification. Also in contrast to Dudek, no attempt is made here to apply this
methodology outside of an office building environment.
Kunz, Willeke, and Nourbakhsh [14] present a method that constructs a topological
map mainly based on the movements of the robot, not on direct usage of sensor data. This
paper assumes that the environment the robot is in is rectilinear and that hallways have a
particular width for a specific building.


One assumption that was made is that sequential hallway intersections are at least six
feet apart. This is supposed to help differentiate between changes in one hallway and a new
hallway. Thus a staggered intersection is not allowed or would not be recognized. Figure
2.4 shows this situation.
Figure 2.4: Staggered Hallway
The map that the robot produces is a graph that represents the topology of the office
building the robot is in. Nodes represent intersections and hallway transitions. Nodes contain
information about their location and whether adjacent nodes have been visited. The arcs of
the graph have distance information associated with them.
The approach in [14] does identify large-scale features. These features fall into three
categories: intersections, hallways, and open areas. An open area is defined as a path through
a physical open area and is differentiated from a hallway based on the expected width of a
hallway. This work has more in common with this thesis than any other paper surveyed. It


describes identifying large-scale features and using these features to understand and
represent the environment.
Topological maps provide a high-level representation of an environment and are
usually more compact than grid-based maps modeling the same environment. These models
are often used for localization but not for navigation since they do not explicitly represent
free space [19].
Topological-based approaches have the concept of identifying features but usually
on a smaller scale than in this research. One approach [14] does identify hallways, open
areas, and intersections but only as artifacts of constructing a topological map. Various
methodologies for identifying features are presented in the next section.
2.4 Feature Identification
Of the papers that discussed grid-based approaches, only Moravec [17] described the
possibility of identifying objects directly from the grid representation. Tasks such as tracking
corridors and identifying doors and desks are mentioned but apparently left for future work.
Some topological approaches [4] [11] [14] dealt implicitly or explicitly with assembling
and/or identifying large-scale indoor features such as corridors or rooms. Most approaches,
however, deal with the identification of small-scale features. Methodologies for identifying
features are presented in this section.
Barshan and Kuc [1] present a method for distinguishing a planar feature, such as a
section of wall, from a corner. They describe an intelligent sensor that uses multiple
ultrasonic transducers to detect differences in amplitudes and travel times of the pulses
emitted by the transducers.


The approach in [1] is based on the concept of a virtual receiver. If there are two
sonar sensors pointing at each other, one acting as a transmitter and the other acting as a
receiver, then it is possible to determine the distance between them using the amplitude of
the signal at the receiver and the angle of incidence between the two sensors. Figure 2.5
shows the geometry of a transmitter and a receiver.
Figure 2.5: Sonar Transmitter and Receiver
Now instead of a transmitter/receiver pair, imagine a single sensor and a feature, a
plane or a corner, that it is facing. Since indoor features are good reflectors of sound they act
as mirrors. Thus, it is possible to place a virtual receiver on the other side of the feature and
calculate the distance to this phantom sensor, because the reflected signal from the feature
acts as if it was received by the virtual receiver. Planes and corners are differentiated
by the sign of the angle of incidence of the received signal. Figure 2.6 shows this
arrangement for planes and corners.


Figure 2.6: Angle of Incidence for Planes and Corners
Politis and Probert [24] extend the work of [1] by using a Continuous Transmission
Frequency Modulated (CTFM) sonar. This system uses a frequency-based approach to
determine the type of feature. This method allows for the identification of planes, corners,
and edges.


The return signal seen by the receiver is mixed with the signal being emitted by the
transmitter. Filtering is performed to produce a signal with only one frequency. This
frequency is directly proportional to the range of the object that reflected the signal. Planes
and corners are differentiated by the sign of the angle of incidence of the signal. Edges only
return a fraction of the signal transmitted since most of the energy is reflected away from the
sensors.
Using this approach, experiments were run to map the boundaries of a room. This
was successfully accomplished but no attempt was made to classify the room as an object or
feature itself.
Kleeman and Kuc [13] also extend the work presented in [1]. Similar to [24] this
approach is able to distinguish between planes, corners, and edges. The concept of an
unknown reflector type is also added. This paper defines edges to include highly curved
surfaces as well as convex corners.
This method does not use amplitude measurements to determine range as in [1]. This
is because the method in [1] will not provide accurate characterization of edge features. By
using two transmitters and two receivers the difference in angles of the signal bearings will
indicate the type of feature. A plane will have a positive difference, a corner will have a
negative difference, and an edge will have a value of zero. Time of flight information is used
in distinguishing edges from planes and comers. If a feature has a low confidence level
associated with it, it is classified as unknown.
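A minimal C++ sketch of this classification rule follows; the tolerance and confidence
threshold are assumptions for illustration, but the sign convention follows the description
above:

#include <cmath>

enum class Reflector { plane, corner, edge, unknown };

// Classify a feature from the difference in signal bearing angles:
// positive -> plane, negative -> corner, near zero -> edge. Features
// with low confidence are classified as unknown.
Reflector classify(double bearingDiff, double confidence)
{
    const double zeroTol = 0.01;    // radians; assumed threshold
    if (confidence < 0.5)           // assumed confidence cutoff
        return Reflector::unknown;
    if (std::fabs(bearingDiff) <= zeroTol) return Reflector::edge;
    return bearingDiff > 0.0 ? Reflector::plane : Reflector::corner;
}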
Ohya, Nagashima, and Yuta [22] present work that is very similar to [1] and [13]. In
this case the primary purpose is to identify walls. This work highlights the fact that it is


possible to find long walls but it is difficult to find small walls, or indentations in walls, that
can only be seen from a very small area within a room or corridor.
Horst [9] presents an algorithm to convert a certainty grid representation into object
boundary curves. The object boundary curves are represented as piece-wise linear segments.
These boundary curves could then be used to detect higher-level features. It should be noted,
however, that the higher-level features in the context of this paper are corners, curves, and
lines.
The methodology developed by Chong and Kleeman [5] identifies partial planes and
corners and then uses a Julier-Uhlmann Kalman Filter (JUKF) to merge these elements into
a map of the environment. Newly identified planes are only merged with the partial plane
that is adjacent to them to prevent doorways from being closed off erroneously.
A partial plane is described by its state parameters, the coordinates of its
approximate endpoints, and endpoint status. Endpoint status indicates whether an end
terminates with another plane to form a corner. When a wall is first detected it appears as a
partial plane with only one endpoint. A corner is characterized by its coordinates only. The
technique used in this paper [5] is unable to determine if a corner is concave or convex.
To eliminate false targets, the positions from where the robot took observations are
maintained. When the map is sufficiently complete filtering is performed by checking the
line of sight from the observation point to the false target. If a partial plane blocks the line of
sight and the false image is far enough away from the plane, the false target can be removed.
Localization is performed and then the feature fusion is done using the estimated
robot position. Removal of redundant features is also performed. That is, if two partial planes


are adjacent and co-linear, they will be merged into one feature. In this way features can be
expanded.
Lacroix and Dudek [15] associate the arcs of sonar scans with a set of real world
primitive features. This approach has the robot rotate in place for a number of revolutions to
collect sonar data. This is in contrast to approaches in [1] [5] [13] [22] [24] that identify a
small-scale feature by taking data from a single position or multiple positions and apply a
squared error or Kalman Filter method to the data.
The set of data taken after several rotations of the robot is used to produce a sonar
scan. The arcs that are a part of one of these scans are called Regions of Constant Depth
(RCDs) because they indicate an object is located at a constant radius along the arc. RCDs
are matched against a set of primitive features. The primitive features include Wall, Corner,
Edge, Cylinder, Cluttered Area, and Multi-Bounce RCD. A Cluttered Area is an object such
as a chair, shelves, etc., that generates RCDs but is not identifiable as a Wall, Corner, etc.
Multi-bounce RCDs are caused by multiple-reflection echoes due to the specular nature of the
environment.
Features are differentiated based on the angular width of the Region of Constant
Depth and a priori Bayesian probability density functions. An RCD corresponds to one of
the primitives in the above set. A hypothesis is created that represents a correspondence
between one RCD and one primitive. It is possible that a single RCD could potentially be a
number of different features, so a hypothesis would be created for each of these possibilities.
Each hypothesis would have an associated probability of existence.


As the robot moves through the environment scans are taken to produce RCDs from
different positions. RCDs developed at different positions are matched to find the Regions of
Constant Depth that correspond to the same primitive feature. Matching can also be used to
differentiate Wall RCDs from Corner and Edge RCDs since RCD orientation remains the
same for Walls but changes for other features.
Features are linked together by applying two rules to the current set of identified
features. One rule states that each Wall is ended by an edge, a corner, or a cluttered area. The
second rule is that each Edge or Corner is supported by two walls that are normal or parallel.
Walls, Corners, and Edges are complete when their supporting primitives are identified.
Work has also been done in the area of feature identification for Autonomous
Underwater Vehicles (AUVs) [25]. The methodology in this paper makes use of a Kalman
Filter to project feature hypotheses into the future. Hypotheses are associated with the type
of feature that may be present. This implies that there could be multiple hypotheses
for one detected feature. Hypotheses are pruned by additional measurements until only one
hypothesis is left.
As an example, a measurement might indicate that a feature is a plane or a curve.
The algorithm assumes the feature is a plane but subsequent predictions are generated both
for the feature being a plane and for it being a curve. Once a hypothesis has been satisfied
the incorrect branch can be pruned. Hypothesis trees can also be pruned when a set number
of hypothesis branches is reached or a time limit is reached.
While the authors of [25] state that "distinctive features" can be identified and
mapped, there is no specific mention of the type of features to be detected. This is somewhat


surprising, especially in light of the fact that the paper represents its approach as a means of
encoding a priori knowledge of the environment. Based on the example of trying to identify
a plane versus a curve it would be reasonable to assume that the granularity being explored is
equivalent to that of land-based studies that identify corners, walls, etc.
In summary, there has been a great deal of work done in the area of feature
identification and in the closely related topic of map representation for a mobile robot. The
motivation for that research was to improve robot navigation, localization, obstacle
avoidance, or a combination of these. This thesis differs from the majority of this work based
on the goal of identifying large-scale features such as corridors and intersections instead of
corners and partial sections of walls. One paper [14] did identify large-scale indoor features
explicitly to construct a topological map of an office building. While the number and type of
features identified was much simpler in [14] than in this research, it provides confirmation
that the identification of complete features is a reasonable approach to describing an indoor
office environment.


3. Approach
This chapter covers the tools that were used in the performance of this work.
Information regarding ultrasonic sensors, the robot platform, large-scale features in an
indoor environment, and an overview of the C++ software program is presented.
3.1 Overview
This research utilizes a mobile robot with a suite of ultrasonic sensors to detect the
features of an indoor environment. In defining this research, a number of decisions were made
regarding the implementation. These decisions are listed below.
1) Make use of an off-the-shelf robot with simple sensors versus a specially made
robot and/or sensor suite. This would demonstrate that non-specialized robots
can perform productive work.
2) Use a simple algorithm for the identification of environmental features.
Specifically, the sonar data obtained by the robot would be used directly in
identifying features with a minimum of filtering and manipulation.
3) Identify complete or large-scale features in the environment as opposed to
components of these features such as walls and corners.
To perform its task, the robot is instructed to move a specified distance down a
corridor. Along the way the robot collects sonar data and processes it. When a feature is
detected the robot informs the user of this event.


The following sections present information about the ultrasonic sensors used, the
robot, and the software written to control the sonar data collection and feature identification
tasks.
3.2 Ultrasonic Sensors
3.2.1 Theory
Ultrasonic or sonar (SOund Navigation And Ranging) sensors are commonly used on
mobile robots that are meant to operate in an indoor environment [6]. The information
provided to a user by the sensor is a single number representing the distance measurement
from the sonar transducer to the closest obstacle to it.
The ultrasonic sensors used in this research are time-of-flight sensors, i.e., these
sensors measure the distance to an object by emitting an energy signal, in this case sound
energy, listening for the echo, and dividing the time between the original signal and the
return echo by two [21].
R = c(Techo − Tsignal)/2                                            (3.1)
where R is the range or radial distance from the sensor to the closest object and c is the speed
of sound. The value of c is:
c = 331.4(T/273)^(1/2) m/sec                                        (3.2)
where T is the temperature in degrees Kelvin.
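Equations (3.1) and (3.2) translate directly into code. A minimal C++ sketch follows
(hypothetical helper functions, not the thesis's program):

#include <cmath>

// Speed of sound from equation (3.2); the argument is the air
// temperature in kelvin.
double speedOfSound(double kelvin)
{
    return 331.4 * std::sqrt(kelvin / 273.0);   // m/sec
}

// Range from equation (3.1): half the round-trip travel time
// multiplied by the speed of sound.
double rangeMeters(double echoTime, double signalTime, double kelvin)
{
    return speedOfSound(kelvin) * (echoTime - signalTime) / 2.0;
}

For example, at 293 K (about 20 °C) the speed of sound is roughly 343 m/sec, so a 10 ms
round trip corresponds to a range of about 1.7 meters.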
Most sonar transmitters, like Radio Frequency (RF) transmitters, do not emit their
energy in an infinitely narrow beam or uniformly in all directions, but rather in a set of


shaped beams, or lobes. The transducers are designed such that the majority of the energy is
contained in the main lobe. Figure 3.1 (from [21]) illustrates this point.
Figure 3.1: Sonar Transducer Lobes
As the sonar pulse moves farther away from the transmitter the area covered by the
main lobe increases. Thus, the return that a sonar sensor receives is no more and no less than
an indication that there is something at a radial distance R along an arc that is bounded by the
beam width of the sonar pulse at distance R. This uncertainty causes great difficulty in
determining the exact location of an object [6] [8] [10] [16] [22]. It should be noted that this
uncertainty in the actual location of an object will decrease as the sensor gets closer to the
object. Figure 3.2 shows the cross section of a sonar beam.


Figure 3.2: Cross Section of a Sonar Beam
Sonar sensors are sensitive to changes in temperature and humidity since these affect
the speed of sound in air [6] [8] [20]. As temperature and humidity increase the amount of
attenuation an ultrasonic signal will experience increases also. For a frequency of 50 kHz,
attenuation can range from 0.6 to 1.8 dB/meter for temperatures of 17 to 28 °C and
relative humidity ranges of 15 to 70 percent [20]. This should be less of a problem in an
indoor environment where these factors can be controlled by air conditioning. Indoor
environments should provide range readings that will be consistent over time.
The surface characteristics of objects in the environment and their orientation to the
main beam of the sonar will also affect the range readings of an ultrasonic sensor. Most
interior surfaces act as mirrors for sound waves and if the surface is at an angle to the sonar
beam, the return echo will be reflected away from the sensor. This will cause the object to
appear to be farther away or to not be detected at all [3] [6] [10]. Brown [3] describes the
problem as follows:
Using an ultrasonic sensor to look at arbitrary objects in a room is rather like
standing in a room completely filled with mirrored objects and having only a penlight glued
to your forehead as a source of light: specularity abounds and many surfaces are not visible.


The reflectivity or absorption characteristics of surfaces also can change the signal
intensity of the echo received by the transducer. The differences in signal intensity can
influence the rise time of the return echo in the sensor and cause more reflective objects to
appear closer [6].
3.2.2 Polaroid 600 Ultrasonic Sensor
The Polaroid 600 ultrasonic sensor uses its transducer as both a transmitter and as a
receiver. At the start of a ranging cycle a train of 16 transmit pulses at 49.4 kHz is
transmitted by the transducer. To eliminate false readings caused by the transducer ringing,
the receiver circuitry is disabled for a brief period. Following the blanking period, the
transducer acts as a receiver [20] [21].
The Polaroid 600 sensor can measure distances from 6 inches to 35 feet (420 inches)
with an accuracy of +/- 1 percent over the range. The system used on the Scout robot restricts
the range from 17 inches to 255 inches. The beam width of the sensor is 25° [20] [21].
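The beam spreading described in Section 3.2.1 can be quantified with simple trigonometry.
The following sketch, which is illustrative and not part of the thesis software, estimates the
arc covered by the 25° beam at a given range:

#include <cmath>

// Approximate width of the sonar footprint at a given range,
// assuming a 25-degree beam (half-angle of 12.5 degrees).
double beamWidthInches(double rangeInches)
{
    const double kPi = 3.14159265358979;
    const double halfAngleRadians = 12.5 * kPi / 180.0;
    return 2.0 * rangeInches * std::tan(halfAngleRadians);
}

At a range of 78 inches the footprint is roughly 35 inches wide, which shows how coarse the
angular information in a single reading is.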
The sonar configuration of the Nomadic Technologies Scout robot has a ring of 16
Polaroid sensors spaced at 22.5° around the circumference. Figure 3.3 is a picture of the
sonar ring. Figure 3.4 shows the arrangement of the sonar sensors in the sonar ring.


Figure 3.3: Sonar Ring of the Scout Robot
Figure 3.4: Arrangement of the Sonar Sensors
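Since the ring spaces its 16 transducers evenly, the bearing of any sensor follows directly
from its index. A minimal sketch (not taken from the thesis code, which appears in
Appendix A):

// Bearing of a transducer relative to the robot's heading; sonar 0 faces
// forward, sonar 4 left, sonar 8 backward, and sonar 12 right.
double sonarBearingDegrees(int sonarIndex)
{
    return sonarIndex * 22.5;
}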


3.3 Robot
A Nomadic Technologies Scout model was used in this research. An Application
Programmer's Interface (API), written in the C programming language, is provided with the
robot and supplies the means to control the robot's movements and to collect
data from the sonar ring. Figure 3.5 is a picture of the robot.
The robot can be controlled by user supplied software that runs in the Linux
operating system environment. This software can run on a laptop computer that is attached to
the Scout robot directly or on a workstation that communicates with the Scout via Radio-
Ethernet. This research used the workstation/Radio-Ethernet configuration for controlling the
robot. Figure 3.6 provides a diagram of this configuration.


Figure 3.5: Nomadic Technologies Scout Robot


Figure 3.6: System Configuration
3.4 Software
The movement and data collection of the robot and the feature identification
algorithm are implemented in a C++ software program. This program was developed using
the GNU g++ compiler and the Nomadic Technologies API. The program runs in the Linux
OS environment.
The C++ program begins by initializing the robot in preparation for movement and
the collection and processing of sonar data. The robot is commanded to traverse a corridor
and collect sonar data at approximately fixed distance intervals (0.5 inches). The raw sonar
data sets are collected into groups of specified size (one or six) and are averaged together.
The averaged sonar data is compared to a set of feature patterns which represent the various


indoor environmental features. The averaged sonar data is matched against rules, i.e., what
the sonar data should look like, that uniquely identify one feature from another. As features
are detected position data and the sonar data used for identification are associated with the
feature. The detected features are logged and a report of the data collected is generated when
the robot reaches the distance it was commanded to travel. For a flow chart of the program,
see Figure 3.7. Refer to Appendix A for the program itself.
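The averaging step can be pictured as an element-wise mean over a group of 16-element
readings. A minimal sketch, assuming readings are stored as vectors of range values in
inches (the actual implementation is in Appendix A):

#include <cstddef>
#include <vector>

// Element-wise average of a non-empty group of sonar readings
// (group size one or six in this work).
std::vector<int> averageReadings(const std::vector<std::vector<int> > &group)
{
    std::vector<int> avg(16, 0);
    for (std::size_t r = 0; r < group.size(); ++r)
        for (std::size_t s = 0; s < 16; ++s)
            avg[s] += group[r][s];
    for (std::size_t s = 0; s < 16; ++s)
        avg[s] /= static_cast<int>(group.size());
    return avg;
}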
The features mentioned above are complete features found in an interior
environment such as corridors, alcoves, etc. This Feature Set encompasses knowledge of the
environment that the robot will be operating in. If the robot detects a feature that does not
satisfy any of the available rules then the feature is identified as unknown. While there is no
a priori map that the robot can refer to, there is a priori knowledge about the type of
environment it will be operating in [25]. Of course, it would be possible to add new features
to the Feature Set to handle alternative environments. The Feature Set is presented in the
following section.


Figure 3.7: Flow of the C++ Program


3.4.1 Feature Set
This work is based on the identification of large-scale features in an indoor
environment. As opposed to being represented as occupied and unoccupied cells in a grid-
based map or as a collection of planes, corners, etc., the features identified represent
complete entities or components of the environment. Figure 3.8 is the set of features that
were expected to be seen in the environment in which the robot was operating.
The features in this set are geometrically simple, angular, and rectilinear, as can be
seen in Figure 3.8. A rectilinear environment implies that the angles where features meet are
90° or 180°. The feature identification approach of this thesis was only tested in a rectilinear
environment. This is because the test environment, the North Classroom building of the
Auraria Campus, is composed of these shapes. Brief descriptions of the members of the
feature set are below.
Corridor A feature that appears long and narrow, i.e., the distances in front and
back of the robot are significantly greater than those to the sides of the robot. For this
work an aspect ratio of 3 (distance front and back > 3 x distance to the sides) was used
(Figure 3.8a). This definition of a corridor eliminates the assumption of a specific corridor
width. Figure 3.9 presents a graph of the predicted sonar data for this feature. The graph
shows the expected range reading for each of the sonar sensors in the robot's sensor ring
starting at sonar 0 and wrapping around the ring back to sonar 0.


a) Corridor  b) Four Way Intersection  c) Up T Intersection
d) Across T Intersection  e) L Intersection  f) Alcove
g) Dual Alcove  h) Corridor End  i) Alcove End  j) Dual Alcove End
Figure 3.8: Indoor Feature Set


Figure 3.9: Predicted Sonar Data for a Corridor
Four Way An intersection of two corridors where both corridors continue on for some
distance. Around the center of this feature distances to the front, back, and sides of the
robot are long (Figure 3.8b). Figure 3.10 shows the predicted sonar data for this feature.
Figure 3.10: Predicted Sonar Data for a Four Way Intersection
T Intersection An intersection of two corridors where one of the corridors ends. For this
research two features represent this concept: Up T and Across T. An Up T is detected by


the robot when it is moving up the lower part of the T (Figure 3.8c). An Across T is the
situation where the robot is moving across the top of the T (Figure 3.8d). Predicted sonar
data for going up a T intersection and across a T intersection are shown in Figures 3.11 and
3.12, respectively.
Figure 3.11: Predicted Sonar Data for an Up T Intersection
Figure 3.12: Predicted Sonar Data for an Across T Intersection


L Intersection An intersection of two corridors where both corridors end. The robot
would detect a long distance to the rear and to one side (Figure 3.8e). Predicted sonar data
for this feature is shown in Figure 3.13. This graph assumes that the robot is entering the L
such that one corridor is behind the robot and the other is to the right of the robot.
Figure 3.13: Predicted Sonar Data for an L Intersection
Alcove A feature where one side of a corridor is wider than the normal width of the
corridor for some distance (Figure 3.8f). Figure 3.14 presents the predicted sonar data for
this feature.
Figure 3.14: Predicted Sonar Data for an Alcove


Dual Alcove Similar to an alcove except that both sides of the corridor are wider than
normal (Figure 3.8g). Predicted sonar data for this type of feature is shown in Figure 3.15.
Figure 3.15: Predicted Sonar Data for a Dual Alcove
Corridor End A feature that represents the end of a corridor. It differs from a corridor in
the fact that one end is closed (Figure 3.8h). Figure 3.16 presents predicted sonar data for a
Corridor End.
Figure 3.16: Predicted Sonar Data for a Corridor End


Alcove End A corridor end with one side wider than is normal for the corridor (Figure
3.8i). Figure 3.17 represents the predicted sonar data set for this feature.
Figure 3.17: Predicted Sonar Data for an Alcove End
Dual Alcove End A corridor end where both sides are wider than normal for the corridor
(Figure 3.8j). A graph of predicted sonar data for a Dual Alcove End is shown in Figure 3.18.
Figure 3.18: Predicted Sonar Data for a Dual Alcove End


As mentioned in the ultrasonic sensor section above, one source of uncertainty in
sonar readings is caused by the beam spreading as it gets farther away from the sensor. The
approach used in this work to detect features actually makes use of this fact. Patterns can be
detected by using sonar readings from different sensors that are taken at the same time and
location. The mechanism for matching sonar data to a feature pattern is by applying a set of
rules that the feature pattern contains to the collected sonar data. The rules of a feature
pattern are used to differentiate it from other feature patterns.
When sonar data is processed it is matched against the rule sets of each feature
pattern. The feature pattern with the highest matching percentage identifies the feature that
the sonar data described. The matching percentage for a feature is determined by dividing the
number of rules satisfied by the total number of rules the feature contains. Thus, if the sonar
data satisfies all the rules for a particular feature it will have a matching percentage of 100.
Matching Percentage = (# of rules matched / # of rules) x 100    (3.3)
If there is more than one feature pattern with the highest matching percentage a
simple tie-breaking algorithm is used. If one of the best matches is a corridor, it will be
identified as the feature. If one of the best matches is a four way intersection it will be
selected unless one of the other candidate feature patterns is a corridor. Otherwise, the first
feature detected with the highest matching percentage is identified as the feature.
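The scoring and tie-breaking just described can be sketched as follows; all names are
hypothetical stand-ins for the classes in Appendix A.

#include <cstddef>
#include <string>
#include <vector>

// Eq. 3.3: percentage of a feature pattern's rules satisfied by the data.
double matchingPercentage(int rulesMatched, int totalRules)
{
    return 100.0 * rulesMatched / totalRules;
}

// Tie-break among the non-empty list of best-matching feature names.
std::string breakTie(const std::vector<std::string> &best)
{
    for (std::size_t i = 0; i < best.size(); ++i)
        if (best[i] == "Corridor")
            return best[i];          // a corridor wins any tie
    for (std::size_t i = 0; i < best.size(); ++i)
        if (best[i] == "Four Way")
            return best[i];          // a four way wins unless a corridor tied
    return best.front();             // otherwise the first detected wins
}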
3.4.2 Rules
When sonar data is being compared to the various features to determine which
feature the data matches, the range readings are actually being applied to a set of rules that


each feature pattern contains. It is how well the sonar observations conform to one of these
rule sets that determines what feature the robot identifies.
Brief descriptions of the rules are presented below. Table 3.1 identifies which rules
are used by which feature patterns.
BackLongRule This rule is used to check whether the sonar data indicates a distance to the
rear of the robot that is greater than or less than the threshold for long distances. If
the value is greater than the threshold the rule is true.
CorrRule This rule is used to check whether sonar readings indicate that the aspect ratio
of the forward and backward pointing readings to the left and right facing readings is above
or below a set value. If the aspect ratio is greater than the threshold the rule is true.
FrontLongRule If the reading for the sonar pointing directly forward is greater than
the specified long threshold then there is open area for a long distance in front of the robot.
In this case the rule returns true.
FrontShortRule This rule is used to check for obstacles near the front of the robot. If the
reading for the sonar facing directly forward is less than the specified short threshold then
the rule is true.
SideInterRule The rule is true if there is an object at a distance between the short and
long thresholds to one side of the robot.


SideLongRule This rule returns true if either the sonar facing directly to the left or
directly to the right of the robot indicates a long open area.
SideShortRule This rule is true if the range reading for the sonar sensor facing directly
left or the sonar sensor facing directly right is less than a specified threshold.
TwoSideInterRule This rule returns true if there are obstacles to both sides of the
robot at distances that are between the short and long threshold values.
TwoSideLongRule This rule returns true if the sonar sensors that directly point left and
right indicate that there are long open areas to the sides of the robot.
TwoSideShortRule This rule is true if the readings for both of the side-facing sensors are
less than a specified threshold.


Rule              Features
BackLongRule      Corridor, Four Way Intersection, Up T Intersection, Across T Intersection,
                  L Intersection, Alcove, Dual Alcove, Corridor End, Alcove End, Dual Alcove End
CorrRule          Corridor
FrontLongRule     Corridor, Four Way Intersection, Across T Intersection, Alcove, Dual Alcove
FrontShortRule    Up T Intersection, L Intersection, Corridor End, Alcove End, Dual Alcove End
SideInterRule     Alcove, Alcove End
SideLongRule      L Intersection, Across T Intersection
SideShortRule     L Intersection, Alcove, Alcove End
TwoSideInterRule  Dual Alcove, Dual Alcove End
TwoSideLongRule   Four Way Intersection, Up T Intersection
TwoSideShortRule  Corridor, Corridor End
Table 3.1: Rule to Feature Mapping


The match function of a rule performs the test to determine whether the sonar data
satisfies the rule or not. Figure 3.19 shows the match function from FrontLongRule. The
complete code for all the rules may be found in Appendix A.
bool FrontLongRule::match(const vector<int> &theData) const  // element type assumed; see Appendix A
{
    // True when the forward-facing sonar reads beyond the long threshold.
    return (getFront(theData) > itsThreshold) ? true : false;
}
Figure 3.19: match Function from FrontLongRule
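For comparison, the aspect-ratio test performed by CorrRule (Section 3.4.1) can be sketched
as follows. This is an illustrative reconstruction, not the thesis code; the helper name and
the indexing are assumptions, and the actual rule classes appear in Appendix A.

#include <algorithm>
#include <vector>

// Hypothetical sketch of CorrRule's test using the four cardinal sensors
// (sonar 0 = front, 4 = left, 8 = back, 12 = right).
bool corridorAspectOk(const std::vector<int> &d)
{
    int lengthwise = std::min(d[0], d[8]);    // front and back ranges
    int crosswise  = std::max(d[4], d[12]);   // left and right ranges
    return lengthwise > 3 * crosswise;        // aspect ratio threshold of 3
}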


4. Results
4.1 Overview
This chapter covers experiments conducted using the robot and software described in
Chapter 3. This system was tested in several runs at different locations within a building. The
system was able to properly identify the features it traversed. The results of identifying
individual features are presented first. A specific experiment to identify multiple features in a
single run is then discussed. A summary of the results is then presented.
4.2 Results of Individual Feature Identification
A series of experiments was run to ascertain whether the approach described in
Chapter 3 successfully identifies individual features. A typical experiment involved
configuring the distance for the robot to travel and the number of sonar readings to average
together for feature determination. Experiments were conducted using either single readings
or sets of six sonar readings averaged together for feature identification. All these values are
contained in the robot.cfg file. An example of the file is contained in Appendix B.
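For orientation, such a configuration might look like the sketch below; the key names here
are hypothetical, and the actual file is reproduced in Appendix B.

travel_distance = 480      ; inches to traverse
readings_to_average = 6    ; 1 for raw data, 6 for filtered data
short_threshold = 54       ; inches
long_threshold = 78        ; inches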
Once the software variable values had been set, the robot was placed in a feature,
usually a corridor, aligned to traverse the feature or set of features in as straight a line as
possible, and turned on. The software controlling the robot was then started and the
experiment proceeded. The robot would identify the initial feature it was in while moving but
as subsequent features were identified the robot would stop to give a visual cue that this


event had occurred. When the robot reached the prescribed distance to travel it stopped and
reported the features it had detected.
Sonar, feature identification, and debugging data were collected to files to allow
post-experiment analysis. The data collected for an experiment to identify a Corridor are
presented in Appendix B.
The features to be identified were presented in Chapter 3 as part of the feature set.
The results of identifying these individual features are presented in the following sections.
4.2.1 Corridor
A Corridor is composed of two walls and is long and narrow in shape. Figure 4.1 is a
scatter plot of the sonar data collected for a Corridor.
Figure 4.1: Sonar Plot for a Corridor


The solid circle in Figure 4.1 represents the position of the robot at the end of the
run. The shapes and range values of predicted and actual sonar data for a Corridor are
plotted in Figure 4.2.
(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.2: Predicted and Actual Sonar Data for a Corridor
The important elements of the Corridor feature are large distance ranges to the front
and back of the robot, as reflected by the readings from sonar 0, the sonar which points
directly ahead of the robot, and sonar 8 which points directly behind the robot. Side readings,
from sonar sensors 4 and 12, would be much smaller because of the Corridor's walls.


While the actual data matches the shape of the predicted data very well it is
interesting to note that the magnitude of the range readings to the front and back are smaller
than would be expected. This is discussed in Section 4.4.
During test runs Corridors were always successfully identified. One run did
demonstrate that if the robot was significantly offset from the mid-line of a hallway the
feature would be identified as an Alcove. This is because the rule set for the Corridor feature
looks for two walls at near range while the alcove feature rules expect one near wall and one
wall at a longer distance away.
Corridors and Alcoves were sometimes identified with equal certainty because the
Corridor Rule in the Corridors rule set was not true. As described in Chapter 3, this rule
looks at the aspect ratio of the feature to see if the length is a set threshold longer than the
width. When the front, back, or both sonar sensors reported shorter than expected values,
this rule could be violated. Occasionally, a Corridor would be incorrectly identified as a
Corridor End due to short front range readings. These readings were transient and probably
caused by a strong reflection off of a corner or a highly acoustically reflective object.
4.2.2 Four Way Intersection
Four Way intersections occur when two orthogonal corridors cross each other and
both continue for some distance. Figure 4.3 is a scatter plot of sonar data for a Four Way
intersection.


Figure 4.3: Sonar Plot for a Four Way Intersection
The corridor that the robot was traversing and the corners where the two corridors
meet are easily identifiable. The cross corridor is not so well defined in the sonar plot. This
is not critical since the robot identified the feature as a Four Way intersection based on
having long ranges of open space to the front, back, and sides.
The first experiments to identify a Four Way intersection were not very successful.
The Four Way features were misidentified as Up T or Dual Alcove features. This was
primarily caused by setting the long threshold for the rules to 120 inches and the short
threshold to 60 inches. As noted above, long range readings were not as long as they were
expected to be. The long and short threshold values were lowered to 78 inches and 54 inches,
respectively, based on the data collected in various experiments.
The rule set for the Four Way pattern was also modified. The Four Way corner rule
(FWCornerRule) was removed from the rule set. Analysis of sonar data showed that the
reasoning behind the rule was somewhat naive. In looking at a Four Way intersection it could


be expected that the corners would return short ranges while the hallways would return long
range readings. In actuality the corner areas return longer range readings than the open
corridors. This was most likely caused by specular effects. Figure 4.4 compares the shapes
and the range values of predicted and actual sonar data for a Four Way intersection.
(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.4: Predicted and Actual Sonar Data for a Four Way Intersection
With the changes to the threshold values and the Four Way rule set, these features
were always identified once the robot was into the feature. However, misidentifications do


occur frequently on entering and exiting a Four Way intersection. This is mainly caused by
one of the side sonar sensors seeing the cross corridor before the other. The pattern
matched in this circumstance was that of an Across T instead of a Four Way intersection.
Once the other side sensor received long range readings the feature was correctly identified
as a Four Way intersection. Averaging sonar readings together usually eliminated one
incorrect identification but did not eliminate them entirely. An interesting side note was that
the right side sensor (12) consistently detected a new feature first. This does not appear to be
caused by a misalignment of the left or right side sensors. It is possible that the main lobe of
sensor 12 is slightly wider than normal.
The misidentification of a Four Way as an Across T occurred because the front
sensor, back sensor, and a side sensor reported long readings while the other side sensor did
not see a long distance. This illustrates how other misidentifications can occur if one or more
sensor readings for a Four Way intersection do not report long. Instead of a Four Way the
sonar readings could match an Up T, an Across T, an Alcove, or a Dual Alcove.


4.2.3 Up T Intersection
Figure 4.5 presents the sonar scatter plot for an Up T intersection feature.
Figure 4.5: Sonar Plot for an Up T Intersection
The scatter plot shows that the ultrasonic sensors detected the walls of the corridor
that the robot was originally traversing and the wall of the cross corridor that it was
approaching. The cross corridor is indicated by the open spaces to the right and left of the
robot. Graphs of the shapes and range values for predicted and actual sonar data are
presented in Figure 4.6.


(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.6: Predicted and Actual Sonar Data for an Up T Intersection
The shape of the actual data matches closely with that of the predicted data. The
robot senses long ranges to the sides and rear with an obstacle or obstacles to the front. Once
again, a major difference between expectations and real data was the magnitude of the long
range readings.
As the robot approached an Up T it initially identified the feature as a Corridor. As it
moved closer to the T itself there was a period where the wall blocking its path was below


the long threshold but above the short threshold for the rule set. Misidentifications of the
feature occurred during this region.
Runs using one sonar reading for identification indicated that the robot had entered a
corridor end when the front range readings were between the long and short threshold values.
As with the Four Way features sonar 12 (right side) began detecting long ranges a few inches
before the left side sonar. This led to the identification of the Up T as an L until the left side
sonar began reporting long ranges. If the cross corridor was wide enough then a Four Way
intersection was identified until the range from the robot to the wall in front of it dropped
below the short threshold. At this point the feature was correctly identified as an Up T and
did not change from this identification.
Experiments that averaged six sonar readings together for pattern matching produced
fewer misidentifications. Readings that had the right side sonar reporting long ranges before
the left side sonar were averaged with prior readings that would have matched a Corridor
feature. The result was that the entry into the Up T was identified as an Alcove. The
identification then switched to a Four Way intersection until the range to the cross corridor
wall dropped below the short threshold. The feature was then correctly identified as an Up T
intersection.
4.2.4 Across T Intersection
Across T intersections are related to Up T features by the fact that they are the same
feature but the robot approaches them from different directions. A sonar data plot for an
Across T is presented in Figure 4.7.


Figure 4.7: Sonar Plot for an Across T Intersection
The scatter plot very clearly shows the corridor that the robot was traversing. The
hash marks around the open area representing the other corridor are the corners where the
two corridors meet. The two bumps in the corridor on the left of the plot are half columns
which projected into the hallway. The intersecting hallway looks very much like the cross
corridor in the Four Way sonar plot. Comparisons of predicted and actual sonar data are
presented in Figure 4.8.


(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.8: Predicted and Actual Sonar Data for an Across T Intersection
The actual sonar data indicates long ranges to the front and back of the robot. One
side also has long range readings while the other side indicates an obstacle at short range.
The double hump of sonar sensors 2 through 4 may indicate specular reflection off the corner
to the front and left of the robot. The actual and predicted shapes match up very well.
There were no feature misidentifications when the robot transitioned into the Across
T from the Corridor and out of the Across T back into a Corridor in runs using one or six


sonar readings for feature matching. However, in the runs which used averaged data single
sonar readings indicating an Alcove or L feature were collected when the T was entered.
These data points were filtered out by the averaging algorithm. As indicated above, these
readings were caused by sonar returns that should have been long but were reported as short.
In these cases an Across T could be misidentified as an Alcove or an L intersection.
4.2.5 L Intersection
A sonar plot for an L intersection would show that the ultrasonic sensors had
detected the corridor that the robot was traversing, the wall of the intersecting corridor in
front of it, and open space to one side where the other corridor was located. Figure 4.9 is a
sonar plot of this type of feature.
Figure 4.9: Sonar Plot for an L Intersection
The half-circle in the corridor is a column that was not flush with the wall. The hash
marks across from the column indicate a doorway with the door closed.


For an L intersection it would be expected that the front sonar and one side sonar
would indicate short ranges while the back sonar and the other side sonar would report long
ranges. Graphs comparing actual sonar data to predicted sonar data for an L intersection are
shown in Figure 4.10.
(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.10: Predicted and Actual Sonar Data for an L Intersection
The shapes of the actual and predicted data match very well. The difference in
magnitude of the long ranges is not surprising given that this difference was seen before


when identifying other features. Another contributor could be the narrowness of the two
corridors forming the L where the tests were performed.
Experiments using raw sonar data, that is single instances of sonar data for feature
pattern matching, had no misidentifications transitioning from the Corridor into the L.
However, it should be noted that sometimes only 50% (2 out of 4) of the L intersection rules
were being matched. This was in the area where the wall in front of the robot was between
the long and short range thresholds and the back range reading dropped below the short
threshold.
Runs averaging six sonar readings together for feature identification displayed much
greater problems with correctly identifying the L intersection. The data shows that the
corridor was correctly identified and at the transition point to the L an L feature was
identified. However, this identification was quickly switched to an Alcove and then to an
Alcove End. While the L had rule matching percentages that equaled the Alcove and the
Alcove End, the algorithm preferred the other features when there was a tie.
At first it appeared that the averaging was causing this misidentification by having a
bad reading or readings affect the average values of the data used for pattern matching. The
real reason was that the range measurements for one of the side sonar sensors, number 4,
reported a value that was equal to the short threshold. It turned out that this value would
cause rules looking for short ranges and rules looking for a range between the short and long
thresholds to be satisfied. If this value only indicated a short range then the L feature would
have been correctly identified. The rules, SideInterRule and SideShortRule, were modified


and the tests were rerun to prove this hypothesis. With these modifications the L intersection
was correctly identified.
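The fix amounts to making the boundary case unambiguous. A sketch of the post-fix tests,
with hypothetical names (the modified rules themselves are in Appendix A):

// After the change, a reading equal to the short threshold counts as
// "short" only; the intermediate test is strictly between the thresholds.
bool sideShort(int range, int shortThreshold)
{
    return range <= shortThreshold;
}

bool sideInter(int range, int shortThreshold, int longThreshold)
{
    return range > shortThreshold && range < longThreshold;
}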
4.2.6 Alcove
An Alcove looks something like an Across T except that instead of a long cross
corridor there would be a "pushing out" of one side of the corridor. The sonar data plot for such a
feature is presented in Figure 4.11.
Figure 4.11: Sonar Plot for an Alcove
The plot shows the Corridor with the Alcove being the trapezoidal section pushed
out to one side. The expected shape of the sonar data for this type of feature would be to
have long ranges to the front and rear, a short range to one side, and an intermediate range,
one between the short and long thresholds, to the other side. Figure 4.12 compares predicted
and actual sonar data for this feature.


(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.12: Predicted and Actual Sonar Data for an Alcove
Tests using one data reading and six data readings for feature identification correctly
identified the Alcove feature. There were no incorrect feature identifications at the
transitions to and from the Alcove. It should be noted that if an Alcove were shallow enough
it would not be identified as a separate feature but would be considered part of the Corridor.
Deep Alcoves would be classified as Across T intersections.


4.2.7 Dual Alcove
A Dual Alcove could be considered halfway between a Corridor and a Four Way
intersection. As Figure 4.13 shows, a Dual Alcove looks like a Four Way intersection but the
ends of the cross corridor are detectable by the robot's sonar sensors.
Figure 4.13: Sonar Plot for a Dual Alcove
The expected shape of the sonar data for a Dual Alcove would be to have long
ranges to the front and rear, and intermediate ranges to each side. Figure 4.14 compares
predicted versus actual sonar data.
(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.14: Predicted and Actual Sonar Data for a Dual Alcove
The shapes of the predicted and actual data are very consistent. The valleys at
sonar numbers 2, 5, 11, and 14 would appear to be caused by returns off the walls of the
alcoves on either side of the corridor.
In all runs the right side sonar sensor, number 12, detected the start and end of the
dual alcove first. Thus, the identification of features went Corridor, Alcove, Dual Alcove,
Alcove, Corridor. It is possible to have a staggered Dual Alcove, that is, where an Alcove
starts (or ends) on one side of a Corridor before an Alcove on the other side of the Corridor,
but for most of the feature it is a Dual Alcove. Unless the stagger between the starts of the
Alcoves is significant, it would be very difficult to tell the difference between a real
staggered Dual Alcove and one that was created by one side sonar sensor detecting a feature
prior to the other side sonar sensor.


4.2.8 Corridor End
Arriving at the end of a Corridor the sonar sensors of the robot would produce an
image like that of Figure 4.15.
Figure 4.15: Sonar Plot for a Corridor End
The sides of the Corridor and the end wall are clearly seen. The bulges on both sides at the
start of the Corridor are doorways.
The sonar range readings for a Corridor End would be expected to have a long range
to the rear of the robot and short distances to the front and sides. Figure 4.16 compares
predicted sonar data to actual sonar data collected in a Corridor End.


(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.16: Predicted and Actual Sonar Data for a Corridor End
While the shapes of the predicted and actual data are similar it is apparent that the
range readings to the back of the robot are much less than would be expected. The
explanation for this probably relates to the narrowness of the hallway in which the tests were
run. The corners of the doorways at the start of the corridor also provided much better
surfaces for returns versus flat walls that would have reflected most of the sound energy
away from the sensors.


In tests, identification of the feature fluctuated between Corridor and Corridor End
until the range to the wall in front of the robot fell below the short threshold. The changes in
feature identification were driven by the range readings obtained by the rear pointing sonar
sensor. When the readings showed a long range then a Corridor matched that data better.
When short ranges to the back of the robot were reported then Corridor End provided a better
match. Once the range to the front wall fell below the short threshold the Corridor End
feature was a better match than Corridor regardless of the range the rear-facing sensor was
reporting.
4.2.9 Alcove End
The Alcove End feature is a combination of the Corridor End and Alcove features.
This feature terminates a hallway but one side of the feature is wider than the normal
corridor width. Figure 4.17 is a scatter plot of sonar data for a feature of this type.
Figure 4.17: Sonar Plot for an Alcove End


As with the Corridor End, the sonar range readings for an Alcove End would be
expected to have a long range to the rear of the robot and a short distance to the front. One
side would have short range readings while the other would indicate intermediate ranges.
Figure 4.18 compares expected sonar data to actual data for an Alcove End.
(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.18: Predicted and Actual Sonar Data for an Alcove End
Similar to the actual data collected for the Corridor End, the range data to the rear of
the robot for the Alcove End feature is not very long. As with the Corridor End this is most
likely due to the narrowness of the hallway being used for the test.


In testing, the feature identification fluctuated between Corridor and Corridor End
until the range to the wall in front of the robot was less than the short threshold. The feature
was then identified as a Corridor End until the Alcove could be observed by a side facing
sonar. When this happened the feature identification was switched to Alcove End.
This misidentification is very similar to what happened in identifying an Up T
intersection. Until the robot was actually in the feature, in this case an Alcove End, the data
it was collecting matched another feature perfectly. Once it was in the feature a correct
identification was possible.
4.2.10 Dual Alcove End
A Dual Alcove End is very similar to an Up T feature. The major difference is that
the ends of the cross corridor are detectable by the robot sonar sensors in a Dual Alcove End.
A sonar scatter plot for this feature is presented in Figure 4.19.
Figure 4.19: Sonar Plot for a Dual Alcove End


The sonar range readings for a Dual Alcove End would have a long range to the rear
of the robot and a short distance to the front. Both sides would indicate intermediate ranges.
Figure 4.20 compares predicted sonar data to actual data collected for a Dual Alcove End.
(a) Predicted and Actual Sonar Shapes
(b) Predicted and Actual Sonar Ranges
Figure 4.20: Predicted and Actual Sonar Data for a Dual Alcove End
As with the Corridor End and Alcove End, the shapes of the predicted and actual
data for a Dual Alcove End are similar but the magnitudes are off. Once again this is
probably due to the close confines of the area used for testing.


The experiments in identifying this feature followed the pattern of the Alcove End
testing. The feature identification fluctuated between Corridor and Corridor End until the
range to the wall in front of the robot was less than the short threshold. The feature was then
identified as a Corridor End until an Alcove could be observed by the right side facing sonar.
When this happened the feature identification was switched to Alcove End. Once the Alcove
on the left side was seen by the left side sonar sensor the feature was correctly identified as a
Dual Alcove End.
4.3 Results of Multiple Feature Identification
A set of experiments was run to verify that the robot could successfully identify a
series of features as it traversed an indoor environment. The tests were set up similarly to the
description in Section 4.2. The distance to be traversed for these experiments was set at 480
inches (40 feet). Tests were run with one of two values (one or six) for the number of sonar
data readings to average together for feature identification. Using one data reading provided
identification using raw data. Sets of six readings produced filtered or averaged data. The
short range threshold was set to 54 inches and the long range threshold was set to 78 inches.
The series of features that the robot maneuvered through was as follows:
Alcove
Corridor
Dual Alcove
Corridor
Four Way Intersection
Corridor


One feature that was not part of the region to be traversed but that may have
impacted the test was an alcove end that terminated the corridor beyond the end of the test
area. This alcove end had glass walls on each side and a metal door facing the test region.
This area was acoustically very specular, a fact that became obvious when entering it and
having sounds echo back very strongly. To determine if this area affected the experiments,
one test was run with a sound absorbing surface placed in front of it very near to the end of
the forty-foot test distance. This environment is depicted in Figure 4.21. The location where
the sound dampening material was placed is indicated by the dashed line.
Figure 4.22 is a photograph of a portion of the area the robot traversed. The metal
door of the alcove end can be seen in the picture. Figure 4.23 is a sonar plot of this
environment.
Figure 4.21: Map of Multiple Feature Environment


Figure 4.22: A Portion of the Multiple Feature Environment


Figure 4.23: Sonar Plot for Multiple Feature Environment
In all tests the initial Alcove feature was correctly identified with the sonar data
matching the Alcove rule set 100%. The transition from the Alcove to the Corridor occurs at
approximately 44 inches into the run. The robot identified this feature change correctly.
This first Corridor feature was never misidentified though there were occasions
where the Corridor rule set was only matched 50% by the sonar data. This occurred in
locations where the range readings to the back of the robot dropped to 70 to 90 inches.
Averaging six sonar data readings together for feature identification eliminated this effect.
Entry into the Dual Alcove feature was detected by the right side sonar sensor first in
most tests. This caused the feature to be initially identified as an Alcove when using raw data
for feature identification. This identification changed when the left side Alcove came into
view of the left side sonar sensor. Tests using averaged data identified the Dual Alcove with
no incorrect identification of an alcove.


Once both side ultrasonic sensors could detect the Alcoves to either side of the robot
the rules for the Dual Alcove were matched 100%. The trash can at the start of the right hand
side Alcove had no effect other than to cause the right side ranges to report 20 inches shorter
than the left side until the robot completely passed the trash can. The trash can was observed
in the sonar readings for approximately 17-20 inches.
The transition from Dual Alcove to Corridor was detected by the right side sonar
sensor first. This caused an identification of an Alcove in all runs except one that used
averaged data. The Dual Alcove in this case was symmetric so the detection of Alcoves on
entering and leaving the feature were incorrect and not an indication that the Alcoves were
offset.
Immediately after exiting the Dual Alcove all the runs have a location or small
distance where the range readings to the front of the robot drop. This caused all the raw data
tests and all but one of the averaged data runs to identify a Corridor End feature. This
location was from three to six inches in length. The averaged data experiments did not
always filter this data out due to the fact that the distance over which the data was filtered
was approximately three inches. This misidentification was probably caused by a sonar ping
being received by the forward pointing sensor that had been generated by a previously fired
transducer. The front facing sonar probably mistook this multi-bounce return as the return to
the ping it generated. There are some corners in the corridor which could have caused the
multi-bounce effect.
After this small area the algorithm correctly identified the Corridor and maintained
this identification until the robot reached the Four Way intersection. Entering into the Four


Way intersection all runs except one averaged data test identified an Across T feature. This
was because the front, back, and right side sonar sensors were reporting long distances while
the left side sensor was seeing a short range. The one averaged data test reported an Alcove
instead of an Across T. This misidentification was caused by the fact that half the data
readings matched a Corridor and the other half matched an Across T. Averaging these
readings together yielded ranges that matched an Alcove feature.
Once the left sensor began reporting long ranges the intersection was properly
identified. Approximately three feet into the Four Way intersection front range readings
began to drop to levels that caused the algorithm to identify the feature as either a Four Way
intersection or as an Up T intersection. The feature matching algorithm will select a Four
Way over any feature other than a Corridor, so the intersection was still correctly
determined to be a Four Way intersection. The front range readings continued to go up and
down through the rest of the intersection. Using averaged data reduced the impact of these
range fluctuations on feature identification.
One note of interest is that by the time the robot was entering the Four Way
intersection it was obvious that it was pulling to the right. The robot drifted to the right by
four to six inches by the end of a run in all tests. This had never been seen in previous tests
but did not appear to affect the identification of features.
The last feature in the test series was a Corridor. The runs using raw data identified a
transition from a Four Way intersection into a Corridor End instead of a Corridor. The data
shows that the front range readings were fluctuating above and below the short range


threshold. The correct identification of a Corridor was made when the front range value
would rise but the majority of readings indicated a Corridor End.
Test runs using averaged data identified an Alcove or Dual Alcove when moving
from the Four Way intersection into the Corridor. Averaging effects account for this as they
did when the robot entered the Four Way and identified the feature as an Alcove. The
averaged data runs that did not have sound deadening material across the Corridor identified
the feature as a Corridor End as the raw data tests had done. The test with the material in the
hallway correctly identified the feature as a Corridor and then identified a Corridor End
when the range to the material dropped below the short range threshold.
Except for the Alcove section and the first Corridor section all the features were
misidentified upon entry or exit in one or more of the runs. The averaged data runs did better
at correctly identifying features and maintaining that identification than the raw data tests.
This performance difference is even greater if single or transient feature identifications are
discarded from the runs. Table 4.1 compares the number of correct feature identifications to
the total number of feature identifications performed for runs using raw and averaged data.
Table 4.2 lists the features identified in the experiments with transient features discarded.
Data Type    Number of Identifications    Number of Correct Identifications    Percent Correct
Raw          388                          339                                  87
Averaged     63                           59                                   94
Table 4.1: Percentage of Correct Feature Identifications


Actual Feature    Raw Data Tests                      Averaged Data Tests
Alcove            Alcove                              Alcove
Corridor          Corridor                            Corridor
Dual Alcove       Alcove, Dual Alcove                 Alcove, Dual Alcove
Corridor          Corridor, Corridor End, Corridor    Corridor
Four Way          Across T, Four Way                  Four Way
Corridor          Corridor, Corridor End              Corridor, Corridor End
Table 4.2: Raw and Averaged Data Test Feature Identification
4.4 Summary of Results
At the beginning of testing the value for the long threshold was set to 120 inches and
the value for the short threshold was 60 inches. Review of data showed that while the short
threshold was reasonable the long threshold was not. The long threshold was reduced to 78
inches (6' 6") and the short threshold was slightly reduced to 54 inches (4' 6").
The rule sets for the Four Way intersection and the Up T intersection were modified
based on empirical data. The logic of the SideShortRule and SideLongRule were modified
based on data obtained during tests run to identify L intersections. These changes increased
the likelihood of correctly identifying a feature.
Misidentification of features was caused by two sources: the physical layout of an
area such that the sonar sensors on the robot could not see the correct feature, and sonar
range readings which did not represent the distance to an obstacle correctly.


Examples of the first case were found in identifying Up T intersections and Alcove
and Dual Alcove Ends. At this point it is important to recall that the rules used to identify
features dealt primarily with four specific sensors. These sensors pointed straight forward
(sonar 0), straight to the left (sonar 4), straight back (sonar 8), and straight right (sonar 12).
Thus, the misidentifications in these cases were caused because the end of the Corridor was
detected but the openings that represented a cross corridor or Alcove areas were not. Figure
4.24 illustrates this point.
Figure 4.24: Robot Detecting a Corridor End Instead of an Up T
Even if the feature identification method used data from all the sonar sensors on the
robot, incorrect identifications of this sort could still occur. As the robot approaches a feature,
how could it tell an Up T intersection from a Dual Alcove End or an L intersection from an
Alcove End? Figures 4.25 and 4.26 depict these situations.


Figure 4.25: Robot Unable to Differentiate an Up T from a Dual Alcove
Figure 4.26: Robot Unable to Differentiate an Alcove End from an L Intersection
Misidentification of features also results from incorrect or misleading range
readings. A corollary cause is expecting certain range readings but not seeing them.
Essentially sonar sensors can only report the ranges they see and the rules for each of the
features can only operate on the data presented by the sensors.
As was noted above, the original range thresholds were set such that objects at
greater than 120 inches would be at a long range and that features closer than 60 inches


would be considered to be located at short range. In initial tests a Four Way intersection was
often identified as a Dual Alcove or an Up T. This was due to shorter than expected range
readings to the front or sides. Changing these thresholds greatly reduced the occurrence of
this problem. Figures 4.27 and 4.28 illustrate these situations.
Figure 4.27: Misidentification of a Four Way as a Dual Alcove
Figure 4.28: Misidentification of a Four Way as an Up T
Incorrect feature identifications still occur due to incorrect range readings as
evidenced by the "phantom" Corridor Ends seen in the multiple feature identification


experiments. The sonar readings definitely indicated obstacles at ranges that in actuality
were open spaces. This effect was most likely caused by surfaces that were more acoustically
reflective than normal and/or by multi-bounce effects. Most incorrect identifications seem to be
confined to the areas where two features meet. Thus moving from a Corridor into a Four
Way intersection can cause the transient identification of an Across T or Alcove feature.
These occurrences were reduced by the use of averaged data to compare to the various
features' rule sets.
In summary, all features in the feature set were correctly identified by the
methodology presented in this research. There were problems with misidentifying features at
transitions between features but these were transitory and the use of averaged sonar data
reduced their occurrence. Some misidentifications that occurred were due to incorrect range
readings. It is important to note that the identification of a current feature did not depend on
the identification of a previous or future feature. That is, there was not, and would not be, a
string of incorrectly identified features created because one in a series was improperly
identified. Identification of a feature was, and is, dependent on the instance, or instances, of
sonar data collected and the rule set for the feature that matched this data the best.


Full Text

PAGE 1

RULE BASED FEA TIJRE IDENTIFICATION FOR INDOOR TOPOLOGICAL MAPPING USING ULTRASONIC SENSORS by John Osber Wetherbie ill B.S., University of California at Los Angeles, 1984 A thesis submitted to the University of Colorado at Denver in partial fulfillment of the requirements for the degree of Master of Science Computer Science 1999

PAGE 2

This thesis for the Master of Science degree by John Osher Wetherbie ill has been approved by Gita Alaghband Lf-/-9( Date

PAGE 3

Wetherbie ill, John Osher (M.S., Computer Science) Rule-Based Feature Identification for Indoor Topological Mapping Using Ultrasonic Sensors Thesis directed by Assistant Professor Christopher E. Smith ABSTRACT The goal of this study is to develop a method for identifying indoor features such as corridors, alcoves, and T junctions suitable for use in mapping, localization, and navigation. This study differs from the majority of work in the area of feature identification due to the focus on detecting large-scale features, e.g., corridors, alcoves, etc., instead of small-scale features such as corners, edges, and planes. A set of feature patterns are useQ. to represent the real world features to be identified. Each feature pattern contained a set of rules that are matched against sonar observations taken by a mobile robot. Sonar readings are taken at specified distance intervals by the robot and matched against the rule sets of the various patterns. The rules are very simple and involve comparing a sonar range reading to one or two threshold values. This simplicity leads to reduced computational complexity and to a more flexible and extensible feature identification algorithm. The pattern which has the highest percentage of its rules satisfied by the sonar data is selected as the current feature. Tests are conducted to identify single features alone and a series of features that would normally be seen by a robot traversing an indoor environment. Results demonstrate that iii

PAGE 4

the proposed methodology successfully identifies all features in the feature set. Misidentifications can occur at the transitions between features in some cases. These events are reduced by using filtered data for feature identification instead of raw sonar data. This abstract accurately represents the contents of the candidate's thesis. I recommend its publication. IV

PAGE 5

DEDICATION This thesis is dedicated to my wife Michele our two sons Andrew and Ryan and to the loving memory of my mother.

PAGE 6

ACKNOWLEDGEMENTS I would like to thank several people for their contributions to this thesis effort. My thanks to my advisor, Chris Smith for his help in coming up with the topic for this research and his continued guidance during the time it took to complete this work. My greatest thanks goes to my family. First to my wife Michele, who made sure I had the evenings and weekends I needed to complete this research. Next to Andrew who was very good about letting Daddy use the computer when he needed to study. Finally to Ryan, who mostly played quietly and did not put my thesis in his mouth once.

PAGE 7

CONTENTS

Chapter
1. Introduction
   1.1 Background
   1.2 Problem Statement
   1.3 Overview
2. Related Work
   2.1 Overview
   2.2 Grid-Based Representations
   2.3 Topological Representations
   2.4 Feature Identification
3. Approach
   3.1 Overview
   3.2 Ultrasonic Sensors
       3.2.1 Theory
       3.2.2 Polaroid 600 Ultrasonic Sensor
   3.3 Robot
   3.4 Software
       3.4.1 Feature Set
       3.4.2 Rules
4. Results
   4.1 Overview
   4.2 Results of Individual Feature Identification
       4.2.1 Corridor
       4.2.2 Four Way Intersection
       4.2.3 Up T Intersection
       4.2.4 Across T Intersection
       4.2.5 L Intersection
       4.2.6 Alcove
       4.2.7 Dual Alcove
       4.2.8 Corridor End
       4.2.9 Alcove End
       4.2.10 Dual Alcove End
   4.3 Results of Multiple Feature Identification
   4.4 Summary of Results
5. Conclusions
   5.1 Evaluation
   5.2 Future Work
   5.3 Summary

Appendix
A. Computer Program
B. Program Output and Configuration File

References


FIGURES

Figure
1.1 Map to Nearest Copier
2.1 A Simple Grid-Based Representation
2.2 Basic Sensor Model
2.3 A Simple Topological Map
2.4 Staggered Hallway
2.5 Sonar Transmitter and Receiver
2.6 Angle of Incidence for Planes and Corners
3.1 Sonar Transducer Lobes
3.2 Cross Section of a Sonar Beam
3.3 Sonar Ring of the Scout Robot
3.4 Arrangement of the Sonar Sensors
3.5 Nomadic Technologies Scout Robot
3.6 System Configuration
3.7 Flow of the C++ Program
3.8 Indoor Feature Set
3.9 Predicted Sonar Data for a Corridor
3.10 Predicted Sonar Data for a Four Way Intersection
3.11 Predicted Sonar Data for an Up T Intersection
3.12 Predicted Sonar Data for an Across T Intersection
3.13 Predicted Sonar Data for an L Intersection
3.14 Predicted Sonar Data for an Alcove
3.15 Predicted Sonar Data for a Dual Alcove
3.16 Predicted Sonar Data for a Corridor End
3.17 Predicted Sonar Data for an Alcove End
3.18 Predicted Sonar Data for a Dual Alcove End
3.19 match Function from FrontLongRule
4.1 Sonar Plot for a Corridor
4.2 Predicted and Actual Sonar Data for a Corridor
4.3 Sonar Plot for a Four Way Intersection
4.4 Predicted and Actual Sonar Data for a Four Way Intersection
4.5 Sonar Plot for an Up T Intersection
4.6 Predicted and Actual Sonar Data for an Up T Intersection
4.7 Sonar Plot for an Across T Intersection
4.8 Predicted and Actual Sonar Data for an Across T Intersection
4.9 Sonar Plot for an L Intersection
4.10 Predicted and Actual Sonar Data for an L Intersection
4.11 Sonar Plot for an Alcove
4.12 Predicted and Actual Sonar Data for an Alcove
4.13 Sonar Plot for a Dual Alcove
4.14 Predicted and Actual Sonar Data for a Dual Alcove
4.15 Sonar Plot for a Corridor End
4.16 Predicted and Actual Sonar Data for a Corridor End
4.17 Sonar Plot for an Alcove End
4.18 Predicted and Actual Sonar Data for an Alcove End
4.19 Sonar Plot for a Dual Alcove End
4.20 Predicted and Actual Sonar Data for a Dual Alcove End
4.21 Map of Multiple Feature Environment
4.22 A Portion of the Multiple Feature Environment
4.23 Sonar Plot for Multiple Feature Environment
4.24 Robot Detecting a Corridor End Instead of an Up T
4.25 Robot Unable to Differentiate an Up T from a Dual Alcove
4.26 Robot Unable to Differentiate an Alcove End from an L Intersection
4.27 Misidentification of a Four Way as a Dual Alcove
4.28 Misidentification of a Four Way as an Up T


TABLES

Table
3.1 Rule to Feature Mapping
4.1 Percentage of Correct Feature Identifications
4.2 Raw and Averaged Data Test Feature Identification


1. Introduction

1.1 Background

Recall the last time you were asked for directions to a particular office or location in a building you work in, say where the nearest copier is located. It is likely that the way you answered went something like this: "Go to the T up ahead and turn right. Go to the end of that corridor where it T's and turn left. Go through a corridor intersection and past an alcove on the right. The next room on the left is the copier room. If you come to another T you've gone too far." You could also provide this information graphically in the form of a simple map. The map wouldn't necessarily be to scale but would indicate the features highlighted in the verbal description, as below:

[Figure 1.1: Map to Nearest Copier]


Both of these methods for providing directions emphasize the features that a person would see as they proceeded towards the copier. The distance to the location and the time to get there are secondary or irrelevant in most indoor situations. Now, the copier in this case seems pretty far away but you would probably feel fairly certain that the person asking for directions will be able to successfully find the copier without too much effort. What if it was a robot that was "asking" for directions? Would a robot be able to understand concepts like "corridor," "alcove," and "T"? Would it be able to recognize these features when it came upon them?

To perform useful tasks a mobile robot needs a "model of the near world" [19] that represents the world's configuration and the robot's location within this environment. Using a model of its surroundings a robot can avoid obstacles, identify changes in the environment, and navigate its way to the nearest copying machine. A map is a symbolic construction and is meant to describe discrete entities, objects or places in the environment [7]. The model, or map, that a robot maintains must provide the information needed to do its work properly.

There are two main ways to express the model of the real world for a robot: grid-based and topological. Grid-based maps are grids and capture, as accuracy allows, the true layout of an environment. The verbal directions and the simple map presented above are topological in nature. Dudek [7] states that "Most humans naturally conceptualize navigation information in both symbolic or topological terms as well as quantitative metric terms, depending on the context, task, and scale."


Topological representations emphasize qualitative and relational information ("Go past two corridor intersections and an alcove on the right.") instead of exact measurements. Janet, et al., [11] look at this as a more intuitive way to represent an environment than approaches that provide specific distance information.

Dudek [7] does make the point that in many cases a topological map that is created and used by a robot does not need to "match up" with the topological map/description that a human would use to describe an area. A single room to a person could appear as two or more rooms to a robot. While this difference in perceiving and describing the environment may not be critical for many types of robots, think back to the robot looking for the nearest copier. Unless the robot has a map of the building that contains the features a person sees, it will be difficult, if not impossible, for it to find the copier. This does not preclude the robot from having additional features or information, but it must be able to understand that a room or a hallway that a person sees is a single entity.

One of the most popular methods for developing a map of a robot's surroundings is to collect sonar data via ultrasonic sensors. This sonar data can then be used to construct grid-based representations or topological models, as described above, where the data has been manipulated to identify/construct simple geometric shapes that compose the environment [5] [15] [17] [26]. Grid-based maps present the environment as areas that are occupied or unoccupied with a granularity that is defined by the size of the cells that make up the grid. The topological approaches emphasize the relationships between landmarks or sets of primitives: planes, corners, edges [5] or walls, corners, edges, and cylinders [17] and then attempt to connect them together in a representation of the robot's surroundings.

This research presents a method which will allow a robot to identify large-scale indoor features such as corridors, alcoves, and T intersections. This approach has the advantage of using features which have semantic value to humans as the building blocks (primitives) of the environmental representation as compared to grid-based and small-scale topological approaches. This will allow a high-level interface to robots which is more in line with the way humans perceive and interact with the environment while being based on constructs that robots can successfully identify directly from sensor data.

1.2 Problem Statement

Current research in developing environmental models for mobile robots uses grid-based approaches, topological maps of "small-scale" features, or a combination of both. None of these approaches identifies features on the scale of corridors, intersections, etc., directly. This thesis develops an approach to identify these "large-scale" indoor features directly from observations taken by a mobile robot.

1.3 Overview

The next chapter presents related work in the areas of grid-based maps, topological representations, and feature identification. Chapter 3 describes the methodology that was used for this work. Chapter 4 gives the results of the feature identification experiments. Chapter 5 draws conclusions based on the results of this research and suggests possible areas for future work.


2. Related Work

2.1 Overview

Robots have two primary ways of expressing knowledge about their surroundings: grid-based representations and topological representations. These maps are used by robots for obstacle avoidance, navigation, and localization; that is, for determining where the robot is within an environment [18] [19]. Not surprisingly, much of the work reviewed in this chapter deals with these topics. The research presented in this thesis diverges from the majority of this work in one important aspect. The focus is to develop an approach which will allow the identification of large-scale features in an indoor environment. Most of the work surveyed deals with the identification of primitive or small-scale features such as corners and planes, if it addresses feature recognition at all.

Large-scale features could be used to create a representation of the robot's surroundings for navigation and localization. They would also have the benefit of having a semantic meaning to accompany the representation; that is, a corridor feature not only "looks" like a corridor but it is identified as such. A map with these types of features could be used to help the robot solve the "find the nearest copier" problem presented in Chapter 1.

Maps and descriptions using large-scale features are used by humans to provide meaningful directions all the time. The use of higher-level abstractions/concepts provides a simpler and more compact way of providing information. A person will better understand and better use the directions "go straight down this corridor to the first alcove on the left" than the directions "go thirty-six tiles straight ahead, stop, then turn to the left." Both sets of directions are accurate and provide the means for a person to get to the copier. From that standpoint the directions are equivalent. Now put yourself in the position of the person wanting to find the copier. Do you want to count the number of tiles you have walked over from where you received the directions? What would happen if the distance given was too short or too long? Blindly following the directions would mean that you would go past the copier or stop before you reached your destination. Are the sets of directions still equivalent? It is also possible that the feature-based directions might be incorrect. In this case you are no worse off than you were with directions that specified an incorrect distance to travel.

In contrast, it might be easier for a robot to use distance traveled than to identify environmental features to determine if the robot has reached its goal. The robot could be provided with an a priori map of the environment that specified distances. It could be commanded to explore the environment and build a metric-based map itself. In either case the ability of the robot to find the copier is dependent on its localization capability, which is based on the ability to determine distance traveled accurately. Depending on this accuracy is not necessarily reasonable since localization and navigation are the subjects of many current research efforts. Use of a feature-based map will eliminate some, if not all, of these problems.

The use of features will also aid in interacting with humans. A person would be able to provide directions to and receive directions from a robot in a much more natural way if the information is topological in nature. Using features which have semantic value to humans and robots as the building blocks for maps will have significant advantages over representations that may have more precise distance information but are more restrictive.

2.2 Grid-Based Representations

Grid-based approaches use one or more two-dimensional arrays of cells as the basis for a model of the world. In their simplest form cells that appear occupied are marked as such by assigning them a value of 1. Empty cells have a value of 0. Cells that are located in unexplored territory could be marked as unknown. Figure 2.1 is an example of a simple grid.

[Figure 2.1: A Simple Grid-Based Representation]

The grid map above shows what the meeting of two corridors, an L intersection, might look like. Cells that are occupied are marked with an X. Unknown cells are designated by a dash in this example. Unoccupied cells are shown as empty.

More sophisticated approaches have probability values associated with each cell. The approaches commonly used to determine and update the probability values of the cells include Bayesian statistics, Dempster-Shafer methods, or variations of these two. For Bayesian reasoning approaches [8] [16] [17] [26] the values are the probability of the cell being occupied and the probability of the cell being unoccupied.
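To make the Bayesian bookkeeping concrete, the sketch below shows one common way to fold a new reading into a cell's occupancy estimate. The log-odds form and the 0.5 prior are standard choices for this kind of update, but the class layout, names, and the inverse sensor model input are illustrative assumptions, not code from this thesis or the cited papers.

    #include <cmath>

    // Minimal sketch of a Bayesian occupancy-grid cell (illustrative only).
    // Each cell stores P(occupied); P(empty) = 1 - P(occupied).
    class BayesCell {
    public:
        BayesCell() : logOdds_(0.0) {}   // log-odds of 0.0 == prior of 0.5

        // pOccGivenReading is an inverse sensor model's estimate that this
        // cell is occupied given the current sonar reading (assumed input).
        void update(double pOccGivenReading) {
            // Bayes' rule in log-odds form: each update reduces to addition.
            logOdds_ += std::log(pOccGivenReading / (1.0 - pOccGivenReading));
        }

        double probOccupied() const {
            return 1.0 - 1.0 / (1.0 + std::exp(logOdds_));
        }

    private:
        double logOdds_;
    };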


For Dempster-Shafer methods [18] [23] the values are the support for the cell being occupied and the support for the cell being empty. Howard and Kitchen [10] add a third value relating to the support for the cell being occupied or empty.

For both probability updating schemes, it is common for each cell in the grid to have its probabilities initialized to a specific value [10] [17]. For Bayesian approaches a value of 0.5 for the occupied and unoccupied probabilities is commonly used. The value of 0.5 represents the fact that it is equally likely that a cell is occupied as it is unoccupied before any readings have been taken. For Dempster-Shafer a mass distribution of:

m(occupied) = 0, m(empty) = 0, m(occupied or empty) = 1 (2.1)

could be used. This indicates no support for the cell being occupied, no support for the cell being empty, and support for the cell being occupied or empty.

The Bayesian and Dempster-Shafer updating schemes act on data that indicates whether a cell may be empty or occupied. This data is presented to these algorithms via a sensor model. The research documented in [8] [17] provides the basic sensor model that is used, with some modifications, by all grid-based approaches. This sensor model has the following properties:

1) If a cell is closer than the range indicated by the sonar measurement then the likelihood of the cell being empty increases.

2) If a cell is farther away than the range indicated by the measurement then the occupied/empty probabilities don't change. Essentially there is nothing that can be determined about the cell.

3) If a cell is at the same distance as the range reading, the cell may be the cause of the return and the probability of occupancy is increased. The amount the probability of occupancy increases is usually inversely proportional to the range of the reading. This "spreads" the probability that one particular cell in an arc of cells is causing the return. This essentially gives more weight to short range readings than to long range readings.

Thus a grid map could have areas that are probably empty, probably occupied, and unknown. Figure 2.2 shows an example of this sonar model.

[Figure 2.2: Basic Sensor Model]

The Histogramic In-Motion Mapping (HIMM) approach [2] describes a simplification of the above model. HIMM only uses a single probability value known as the Certainty Value with a range of 0 to 15. More importantly, only the cells along the acoustic axis of the sensor are updated by a measurement versus the entire beam-width of the sensor. The one cell that has its Certainty Value incremented is the cell at the measured range of the reading. The empty cells between the sensor and the cell at the measured range have their Certainty Values decremented. When a Certainty Value is incremented, its value is increased by three but when the value is reduced the decrement is one. These values for incrementing and decrementing the Certainty Values were arrived at based on experimentation.
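The HIMM rule is simple enough to state directly in code. The sketch below applies the +3/-1 increments along the acoustic axis as described above; the grid layout and function names are assumptions made for illustration, not the implementation from [2].

    #include <algorithm>
    #include <vector>

    // Illustrative HIMM certainty-value update (not the code from [2]).
    // grid holds Certainty Values in [0, 15]; axisCells lists cell indices
    // along the sonar's acoustic axis, ordered from the sensor outward,
    // with the last entry being the cell at the measured range.
    void himmUpdate(std::vector<int>& grid, const std::vector<int>& axisCells) {
        if (axisCells.empty()) return;

        // Cells between the sensor and the measured range are presumed
        // empty: decrement by 1, clamped at 0.
        for (std::size_t i = 0; i + 1 < axisCells.size(); ++i)
            grid[axisCells[i]] = std::max(0, grid[axisCells[i]] - 1);

        // The cell at the measured range caused the echo: increment by 3,
        // clamped at the maximum Certainty Value of 15.
        grid[axisCells.back()] = std::min(15, grid[axisCells.back()] + 3);
    }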


An interesting problem that arises with the use of sonar sensors in mapping indoor environments is that most surfaces in an indoor environment are very good acoustic mirrors [3]. If a surface is at an angle to the sensor then the echo return could be partially or completely reflected away from the sensor. This could make the surface appear farther away or even "invisible" to an ultrasonic sensor [3] [10]. The sound pulse could also bounce off multiple surfaces before being received by the sonar sensor. This would also yield a reading that would indicate a surface is farther away than it is in actuality [3] [10] [16].

The approaches outlined in [2] [8] [17] took a simple approach to handling the specular qualities of indoor environments by rejecting range readings above a certain maximum value. This limit was meant to eliminate the problems caused by specular reflection based on the assumption that range readings caused by specular reflection occur at the maximum range of a sonar sensor.

Howard and Kitchen [10] use a modified occupancy grid called a response grid to build a model of an indoor environment. A response grid is a two-dimensional array of cells like an occupancy/certainty grid. The modification attempts to provide a more realistic handling of specular reflection to build a more accurate occupancy grid. The basic premise of the response grid is that a cell can appear to be occupied when viewed from one direction but will appear empty (that is, not generate a response) from another. A smooth surface only returns an echo to a sensor when the angle of incidence between the surface and the sonar beam is near zero. At larger angles of incidence the surface will return a weak response or none at all. Thus the same feature at the same location would appear differently based on the location of the sonar sensor. In other grid-based approaches this would lead to a contradiction since they assume that an occupied cell would appear occupied from all directions.

Each cell in the grid can be empty or contain one or more surfaces that will reflect ultrasonic pulses. Cell occupancy is determined by assuming that a cell that has an echo return from it, in one or more directions, must have a surface in it and is therefore occupied. This method leaves open the possibility that a cell might have surfaces in it that do not reflect in any direction. Howard and Kitchen state that the conditions that could cause this are rare in indoor environments.

Whether a cell is occupied or not is indicated by the state variable Occ that can be set to one of two values:

Occ(x, y) = [occupied, unoccupied] (2.2)

The response of a cell for a direction is maintained by the state variable Res that can be set to one of two values:

Res(x, y, direction) = [response, no response] (2.3)

Thus a cell that has one or more responses associated with it will be marked as occupied with some amount of probability based upon the update methodology used.
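A response-grid cell can be pictured as a small record that keeps one response flag per viewing direction and derives occupancy from them. The eight-direction discretization and the names below are assumptions made for illustration; they are not details taken from [10].

    #include <array>

    // Illustrative response-grid cell (assumed 8 discrete viewing directions).
    struct ResponseCell {
        std::array<bool, 8> response{};  // Res(x, y, direction)

        // Occ(x, y): a cell that produced an echo from any direction is
        // treated as containing a surface and therefore occupied.
        bool occupied() const {
            for (bool r : response)
                if (r) return true;
            return false;
        }
    };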


Lim and Cho [16] also address the problem of specular reflection and how it can affect the accuracy of a grid-based map. This paper introduced the concept of specular reflection probability and used it to modify the Bayesian updating function for the grid. The specular reflection probability is composed of two components; one is the Range Confidence Factor (RCF) and the other is the orientation probability.

As noted above, specular reflection can cause misleading range information which indicates an obstacle is farther away or that it is not there. As with the approaches in [2] [8] [17] this will lead to interpreting range readings toward the maximum range of the sensor as specular reflection. In contrast, instead of using a limit that throws away the data, [16] applies specular reflection probability to reduce the confidence of the reading.

The Range Confidence Factor is used to reduce the confidence in the range of long range readings. This will reduce the amount that a cell's probability of occupancy will be incremented if that cell is at a long range from the sensor. The orientation probability indicates that the surface within a cell is oriented with a certain probability. This probability is determined by collecting data at different locations for the cell. In this sense it is somewhat like summing up all the Res state variables from [10] to determine the orientation of the surface.

The orientation probability is used to affect the confidence of a range reading. Remember that the incidence angle of the sensor beam is also a major parameter for specular reflection. If a cell appears to return a response but the orientation probability indicates that the surface orientation is such that the angle of incidence is high, then the probability of occupancy should not be incremented as much as if the incidence angle was low.

Standard sonar sensors have very poor angular resolution because of their wide beam widths [2] [6] [8] [10] [16] [17] [22]. This makes it necessary to combine range readings for a cell from a number of different positions to determine the probability of whether a cell is occupied or unoccupied.
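The effect of the Range Confidence Factor described above can be illustrated with a simple monotonically decreasing weight. The linear falloff below is an assumed form chosen only for illustration; [16] defines its own RCF curve.

    // Illustrative Range Confidence Factor: confidence in a reading decays
    // with range. The linear form is an assumption; Lim and Cho [16]
    // define their own RCF function.
    double rangeConfidence(double range, double maxRange) {
        if (range <= 0.0) return 1.0;
        if (range >= maxRange) return 0.0;
        return 1.0 - range / maxRange;  // 1.0 near the sensor, 0.0 at max range
    }

    // Usage sketch: scale the occupancy increment for the cell at the
    // measured range by the confidence in that reading.
    // double increment = baseIncrement * rangeConfidence(reading, maxRange);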


As an illustration of how this works, assume that one reading indicates that there is an object along an arc at a particular range. A second reading that indicates empty space intersects the first arc. The area of intersection could be considered to have an increased probability of being empty, or the surface orientation of the cell may be such that it provided a better return from one position than another. This information will cause the probabilities of occupancy and emptiness to be updated for the cells in the two sensor sweeps. As more data is accumulated for a cell the probabilities will approach the real state of the cell.

Grid-based approaches suffer from space and time complexity [11] [26], especially as the size of the environment represented grows. Moravec [17] mentions that the grid representation was somewhat reluctantly adopted at Carnegie Mellon's Mobile Robot Laboratory. This reluctance may have been related to the amount of memory required to use a grid-based map. [12] and [19] address the space and computational complexities of grid-based approaches by using tree structures to reduce the amount of data needed to represent an environment. These tree structures are called Quadtrees and Octrees based on the maximum number of children, four and eight, respectively, that a node in the tree may possess. Quadtrees are commonly used to represent two-dimensional areas while Octrees are normally applied to three-dimensional spaces [12].

The root node of the tree represents the entire grid map and the subsequent generations of child nodes recursively partition the map into smaller and smaller areas. Nodes can represent empty space, occupied space, or partially occupied space. Empty and occupied nodes have no children while partially occupied nodes have the maximum number of children (4 or 8) for the type of tree. In this way a single node can be substituted for a large number of grid elements and reduce the memory required for the representation [12].
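A minimal quadtree node for a two-dimensional map might look like the sketch below. The three-state labeling follows the description above, while the class layout itself is an illustrative assumption rather than code from [12] or [19].

    #include <array>
    #include <memory>

    // Illustrative quadtree node for a 2-D occupancy map.
    struct QuadNode {
        enum class State { Empty, Occupied, Mixed };
        State state = State::Empty;

        // Only a Mixed (partially occupied) node is subdivided into four
        // children; Empty and Occupied nodes are leaves, so one leaf can
        // stand in for a large block of uniform grid cells.
        std::array<std::unique_ptr<QuadNode>, 4> children;

        void subdivide() {
            state = State::Mixed;
            for (auto& c : children)
                c = std::make_unique<QuadNode>();
        }
    };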


To summarize, grid-based approaches are a popular way to represent the environment that a robot is traversing. This is in spite of the specular reflection and angular resolution problems associated with the sonar sensors that are commonly used to provide input to these maps. Ultrasonic sensors are attractive for use because of simplicity, low cost, and distance measurements that are provided directly [8]. These distance measurements can be directly placed into a grid-based map or modified to take into account uncertainties due to specular reflection and angular resolution. Grid-based approaches suffer from space and time complexity as the environment represented grows larger. Tree structures have been used to reduce the memory required by grid maps.

2.3 Topological Representations

A topological map provides a high-level description of an environment that is dependent on the structure and features detectable in that environment [19]. A topological map could be a simple list of features [19] but more commonly the underlying implementation is a graph [4] [7] [26]. The nodes in the graph represent interesting locations or features found in the environment. These interesting locations, such as a room or a corridor intersection, act as recognizable landmarks. The arcs of the graph can represent the path between distinct places, like a hallway or a doorway, or the transitions between features. Figure 2.3 shows a simple topological map with arcs representing hallways and nodes indicating "interesting areas", such as offices, intersections, etc., that were identified in the exploration of an area.


[Figure 2.3: A Simple Topological Map]

Topological models usually provide a more compact representation of the environment than grid-based approaches [7] [26]. They are also somewhat more intuitive than grid-based approaches in the sense that they recognize distinct or recognizable areas in the way a person might [7] [11]. Topological maps express the environment in terms of landmarks such as those mentioned to allow a robot to find the nearest copier machine from Chapter 1.

Topological maps can be constructed directly by accumulating nodes and arcs as a robot navigates through an environment or by translating another representation into a topological map. Other representations that have been translated into topological maps include grid-based maps [8] [26] and clusters of points that were assembled by a neural net [11].
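As a concrete picture of the graph structure involved, the sketch below accumulates landmark nodes and connecting arcs as they are observed. The field names and adjacency-list layout are illustrative assumptions, not a representation taken from any of the cited systems.

    #include <string>
    #include <vector>

    // Illustrative topological map as an undirected graph.
    struct TopoMap {
        struct Node {
            std::string label;      // e.g., "corridor intersection", "office"
            std::vector<int> arcs;  // indices of adjacent nodes
        };
        std::vector<Node> nodes;

        int addNode(const std::string& label) {
            nodes.push_back({label, {}});
            return static_cast<int>(nodes.size()) - 1;
        }

        // An arc represents a traversable path (e.g., a hallway) between
        // two recognizable places.
        void addArc(int a, int b) {
            nodes[a].arcs.push_back(b);
            nodes[b].arcs.push_back(a);
        }
    };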


Elfes' [8] approach uses a hierarchy of maps. The grid-based maps are the first level, called the sensor level. The second level is the geometric level, created by identifying groups of cells with high probabilities of occupancy and labeling them as unique entities, such as desks and chairs. The third level is the symbolic level. The maps at this level are topological in nature and contain information about larger areas than the maps on the lower two levels. Nodes may represent interesting areas where additional information is provided or they could represent corridors.

Thrun and Bucken [26] also build a topological map of the environment based on a grid-based representation. To accomplish this the free space of the grid map is partitioned into a number of regions separated by "critical lines." Critical lines correspond to narrow passages such as doorways. The partitioned map is then converted into a graph. The regions are mapped to nodes and the critical lines, that is, the paths between the regions, become arcs.

Pure topological approaches often have difficulty telling two different locations that look alike apart, especially if they have been reached by different routes. This is because topological approaches primarily use the robot's position relative to landmarks to determine where they are in the environment. The methods described in [8] and [26] use grid-based and topological maps in combination to distinguish different locations that look alike topologically. Accessing the position data available from a grid map makes differentiating topologically similar locations easier.

Janet, et al., [11] use the combination of a neural network and hyper-ellipsoid clustering to create a topological map. Hyper-Ellipsoid Clustering (HEC) groups sets of points obtained from sonar data into elliptical areas. The elliptical areas filter out outlying data points. By using a hill-climbing algorithm the data presented by the HEC Kohonen method can be used for location recognition. The approach presented does identify large-scale features like corridors as areas of open space but the primary focus is on identifying and utilizing collections of line segments as landmarks.

Chong and Kleeman [5] use a grid map and a topological map in concert. The grid map is used for obstacle avoidance while the feature-based map is used for localization. Features are classified as planes, corners, edges, or unknown. Unknown features are not added to the topological map. One interesting note about the grid-based map is that the cells have occupancy and distance information associated with them instead of probabilities of occupancy and emptiness.

Bulata and Devy [4] use a hierarchy of models to build a topological representation of an environment, similar to [8] and [26], but do not use grid-based maps. Another difference is that the robot in this research uses a laser range finder instead of sonar sensors to acquire information about its environment. The basic building block in this approach is a landmark. Landmarks are defined as a set of line segments and include doorways with and without doors, portions of corridors, and geometric features found in rooms and corridors. The "useful" landmarks [4] in the data are extracted and used to construct the geometric map. The geometrical model contains information about landmarks. The symbolic model groups landmarks into areas. An area corresponds to a complete entity such as a room or a corridor. The topological model is a map composed of areas which define the entire known environment of the robot.


Dudek [7] uses an approach that is very similar to that presented in [26]. Grid maps of local areas are merged together into a topological representation. The topological map is composed of metric regions that have position and distance data collected for them, non-metric regions that serve as parts of a topological link between metric regions, and unknown regions that have not been visited by the robot. An office or other "interesting" location would be an example of a metric region. A hallway could be a non-metric area. Line segments are used to model collections of observations of the environment. Each segment can be thought of as representing a section of a wall or other obstacle. Line segments that are close together and parallel or near parallel are merged together.

Large-scale features could be generated from the grid maps but are not because this would require domain-specific assumptions about the environment. Dudek [7] cites the example of an office building which would require rules and constraints such as typical offices having rectilinear orientations and hallways and doors having standard widths. It is interesting to note that the research presented in this thesis uses rules very much like those mentioned in [7] for identifying large-scale features. However, this work doesn't depend on the types of restrictive assumptions that Dudek feels are necessary for performing feature identification. Also in contrast to Dudek, there was no concept of applying this methodology outside of an office building environment.

Kunz, Willeke, and Nourbakhsh [14] present a method that constructs a topological map mainly based on the movements of the robot, not on direct usage of sensor data. This paper assumes that the environment the robot is in is rectilinear and that hallways have a particular width for a specific building.


One assumption that was made is that sequential hallway intersections are at least six feet apart. This is supposed to help differentiate between changes in one hallway and a new hallway. Thus a "staggered" intersection is not allowed or would not be recognized. Figure 2.4 shows this situation.

[Figure 2.4: Staggered Hallway]

The map that the robot produces is a graph that represents the topology of the office building the robot is in. Nodes represent intersections and hallway transitions. Nodes contain information about location and whether adjacent nodes have been visited. The arcs of the graph have distance information associated with them.

The approach in [14] does identify large-scale features. These features fall into three categories: intersections, hallways, and open areas. An open area is defined as a path through a physical open area and is differentiated from a hallway based on the expected width of a hallway. This work has more in common with this thesis than any other paper surveyed. It describes identifying large-scale features and using these features to understand and represent the environment.

Topological maps provide a high-level representation of an environment and are usually more compact than grid-based maps modeling the same environment. These models are often used for localization but not for navigation since they do not explicitly represent free space [19]. Topological-based approaches have the concept of identifying features but usually on a smaller scale than in this research. One approach [14] does identify hallways, open areas, and intersections but only as artifacts of constructing a topological map. Various methodologies for identifying features are presented in the next section.

2.4 Feature Identification

Of the papers that discussed grid-based approaches only Moravec [17] described the possibility of identifying objects directly from the grid representation. Tasks such as tracking corridors and identifying doors and desks are mentioned but apparently left for future work. Some topological approaches [4] [11] [14] dealt implicitly or explicitly with assembling and/or identifying "large-scale" indoor features such as corridors or rooms. Most approaches, however, deal with the identification of "small-scale" features. Methodologies for identifying features are presented in this section.

Barshan and Kuc [1] present a method for distinguishing a planar feature, such as a section of wall, from a corner. They describe an "intelligent sensor" that uses multiple ultrasonic transducers to detect differences in amplitudes and travel times of the pulses emitted by the transducers.


The approach in [1] is based on the concept of a "virtual" receiver. If there are two sonar sensors pointing at each other, one acting as a transmitter and the other acting as a receiver, then it is possible to determine the distance between them using the amplitude of the signal at the receiver and the angle of incidence between the two sensors. Figure 2.5 shows the geometry of a transmitter and a receiver.

[Figure 2.5: Sonar Transmitter and Receiver]

Now instead of a transmitter/receiver pair, imagine a single sensor and a feature, a plane or a corner, that it is facing. Since indoor features are good reflectors of sound they act as mirrors. Thus, it is possible to "place" a virtual receiver on the other side of the feature and calculate the distance to this phantom sensor because the reflected signal from the feature acts as if it was received by the virtual receiver. Planes and corners are differentiated by the sign of the angle of incidence of the received signal. Figure 2.6 shows this arrangement for planes and corners.


[Figure 2.6: Angle of Incidence for Planes and Corners — (a) Plane, (b) Corner]

Politis and Probert [24] extend the work of [1] by using a Continuous Transmission Frequency Modulated (CTFM) sonar. This system uses a frequency-based approach to determine the type of feature. This method allows for the identification of planes, corners, and edges.


The return signal seen by the receiver is mixed with the signal being emitted by the transmitter. Filtering is performed to produce a signal with only one frequency. This frequency is directly proportional to the range of the object that reflected the signal. Planes and corners are differentiated by the sign of the angle of incidence of the signal. Edges only return a fraction of the signal transmitted since most of the energy is reflected away from the sensors. Using this approach experiments were run to map the boundaries of a room. This was successfully accomplished but no attempt was made to classify the room as an object or feature itself.

Kleeman and Kuc [13] also extend the work presented in [1]. Similar to [24] this approach is able to distinguish between planes, corners, and edges; the concept of an unknown reflector type is also added. This paper identifies edges to include highly curved surfaces along with convex corners. This method does not use amplitude measurements to determine range as in [1]. This is because the method in [1] will not provide accurate characterization of edge features. By using two transmitters and two receivers the difference in angles of the signal bearings will indicate the type of feature. A plane will have a positive difference, a corner will have a negative difference, and an edge will have a value of zero. Time of flight information is used in distinguishing edges from planes and corners. If a feature has a low confidence level associated with it, it is classified as unknown.
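The decision rule described for [13] reduces to the sign of a bearing-angle difference, which can be sketched as below. The tolerance band around zero and the confidence threshold are assumed details added so that noisy measurements can still map to the edge and unknown cases; they are not values from the paper.

    #include <cmath>

    // Illustrative classifier based on the sign of the bearing-angle
    // difference described for [13]. The epsilon tolerance and confidence
    // threshold are assumptions, not values from the paper.
    enum class Reflector { Plane, Corner, Edge, Unknown };

    Reflector classify(double bearingDifference, double confidence) {
        const double epsilon = 0.01;   // radians; assumed noise tolerance
        if (confidence < 0.5) return Reflector::Unknown;  // assumed threshold
        if (bearingDifference > epsilon)  return Reflector::Plane;
        if (bearingDifference < -epsilon) return Reflector::Corner;
        return Reflector::Edge;        // near-zero difference
    }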


Ohya, Nagashima, and Yuta [22] present work that is very similar to [1] and [13]. In this case the primary purpose is to identify walls. This work highlights the fact that it is possible to find long walls but it is difficult to find small walls, or indentations in walls, that can only be "seen" from a very small area within a room or corridor.

Horst [9] presents an algorithm to convert a certainty grid representation into object boundary curves. The object boundary curves are represented as piece-wise linear segments. These boundary curves could then be used to detect higher-level features. It should be noted, however, that the "higher-level features" in the context of this paper are corners, curves, and lines.

The methodology developed by Chong and Kleeman [5] identifies partial planes and corners and then uses a Julier-Uhlmann Kalman Filter (JUKF) to merge these elements into a map of the environment. Newly identified planes are only merged with the partial plane that is adjacent to them to prevent doorways from being closed off erroneously. A partial plane is described by its state parameters, the coordinates of its approximate endpoints, and endpoint status. Endpoint status indicates whether an end terminates with another plane to form a corner. When a wall is first detected it appears as a partial plane with only one endpoint. A corner is characterized by its coordinates only. The technique used in this paper [5] is unable to determine if a corner is concave or convex.

To eliminate false targets, the positions from where the robot took observations are maintained. When the map is "sufficiently complete" filtering is performed by checking the line of sight from the observation point to the false target. If a partial plane blocks the line of sight and the false image is far enough away from the plane, the false target can be removed. Localization is performed and then the feature fusion is done using the estimated robot position. Removal of redundant features is also performed. That is, if two partial planes are adjacent and co-linear, they will be merged into one feature. In this way features can be expanded.

Lacroix and Dudek [15] associate the arcs of sonar scans with a set of real-world primitive features. This approach has the robot rotate in place for a number of revolutions to collect sonar data. This is in contrast to approaches in [1] [5] [13] [22] [24] that identify a small-scale feature by taking data from a single position or multiple positions and apply a squared error or Kalman Filter method to the data.

The set of data taken after several rotations of the robot is used to produce a sonar scan. The arcs that are a part of one of these scans are called Regions of Constant Depth (RCDs) because they indicate an object is located at a constant radius along the arc. RCDs are matched against a set of primitive features. The primitive features include Wall, Corner, Edge, Cylinder, Cluttered Area, and Multi-Bounce RCD. A Cluttered Area is an object such as a chair, shelves, etc., that generates RCDs but is not identifiable as a Wall, Corner, etc. Multi-bounce RCDs are caused by multiple reflection echoes due to the specular nature of the environment.

Features are differentiated based on the angular width of the Region of Constant Depth and a priori Bayesian probability density functions. An RCD corresponds to one of the primitives in the above set. A hypothesis is created that represents a correspondence between one RCD and one primitive. It is possible that a single RCD could potentially be a number of different features, so a hypothesis would be created for each of these possibilities. Each hypothesis would have an associated probability of existence.


As the robot moves through the environment scans are taken to produce RCDs from different positions. RCDs developed at different positions are matched to find the Regions of Constant Depth that correspond to the same primitive feature. Matching can also be used to differentiate Wall RCDs from Corner and Edge RCDs since RCD orientation remains the same for Walls but changes for other features.

Features are linked together by applying two rules to the current set of identified features. One rule states that each Wall is ended by an edge, a corner, or a cluttered area. The second rule is that each Edge or Corner is supported by two walls that are normal or parallel. Walls, Corners, and Edges are complete when their supporting primitives are identified.

Work has also been done in the area of feature identification for Autonomous Underwater Vehicles (AUVs) [25]. The methodology in this paper makes use of a Kalman Filter to project feature hypotheses into the future. Hypotheses are associated with the type of feature that may be being detected. This implies that there could be multiple hypotheses for one detected feature. Hypotheses are pruned by additional measurements until only one hypothesis is left.

As an example, a measurement might indicate that a feature is a plane or a curve. The algorithm assumes the feature is a plane but subsequent predictions are generated both for the feature being a plane and for it being a curve. Once a hypothesis has been satisfied the incorrect branch can be pruned. Hypothesis trees can also be pruned when a set number of hypothesis branches is reached or a time limit is reached.

While the authors of [25] state that "distinctive features can be identified and mapped" there is no specific mention of the type of features to be detected. This is somewhat surprising, especially in light of the fact that the paper represents its approach as a means of encoding a priori knowledge of the environment. Based on the example of trying to identify a plane versus a curve it would be reasonable to conclude that the granularity being explored is equivalent to that of land-based studies that identify corners, walls, etc.

In summary, there has been a great deal of work done in the area of feature identification and in the closely related topic of map representation for a mobile robot. The motivation for this research was to improve robot navigation, localization, obstacle avoidance, or a combination of these. This thesis differs from the majority of this work based on the goal of identifying large-scale features such as corridors and intersections instead of corners and partial sections of walls. One paper [14] did identify large-scale indoor features explicitly to construct a topological map of an office building. While the number and type of features identified was much simpler in [14] than in this research, it provides confirmation that the identification of complete features is a reasonable approach to describing an indoor office environment.


3. Approach

This chapter covers the tools that were used in the performance of this work. Information regarding ultrasonic sensors, the robot platform, "large-scale" features in an indoor environment, and an overview of the C++ software program is presented.

3.1 Overview

This research utilizes a mobile robot with a suite of ultrasonic sensors to detect the features of an indoor environment. In defining this research a number of decisions were made regarding the implementation. These decisions are listed below.

1) Make use of an "off-the-shelf" robot with simple sensors versus a specially made robot and/or sensor suite. This would demonstrate that non-specialized robots can perform productive work.

2) Use a simple algorithm for the identification of environmental features. Specifically, the sonar data obtained by the robot would be used directly in identifying features with a minimum of filtering and manipulation.

3) Identify complete or "large-scale" features in the environment as opposed to components of these features such as walls and corners.

To perform its task, the robot is instructed to move a specified distance down a corridor. Along the way the robot collects sonar data and processes it. When a feature is detected the robot informs the user of this event.


The following sections present information about the ultrasonic sensors used, the robot, and the software written to control the sonar data collection and feature identification tasks.

3.2 Ultrasonic Sensors

3.2.1 Theory

Ultrasonic or sonar (SOund NAvigation and Ranging) sensors are commonly used on mobile robots that are meant to operate in an indoor environment [6]. The information provided to a user by the sensor is a single number representing the distance measurement from the sonar transducer to the closest obstacle to it. The ultrasonic sensors used in this research are time-of-flight sensors, i.e., these sensors measure the distance to an object by emitting an energy signal, in this case sound energy, listening for the echo, and dividing the time between the original signal and the return echo by two [21]:

R = c(Techo / 2) (3.1)

where R is the range or radial distance from the sensor to the closest object, Techo is the time between the transmitted pulse and the return echo, and c is the speed of sound. The value of c is:

c = 331.4(T/273)^(1/2) m/sec (3.2)

where T is the temperature in degrees Kelvin.

Most sonar transmitters, like Radio Frequency (RF) transmitters, do not emit their energy in an infinitely narrow beam or uniformly in all directions, but rather in a set of shaped beams, or lobes. The transducers are designed such that the majority of the energy is contained in the main lobe. Figure 3.1 (from [21]) illustrates this point.

[Figure 3.1: Sonar Transducer Lobes]
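Equations 3.1 and 3.2 translate directly into code. The sketch below is a minimal illustration; the function and parameter names are assumptions made for this example.

    #include <cmath>

    // Illustrative time-of-flight range calculation (Equations 3.1 and 3.2).
    double speedOfSound(double kelvin) {
        return 331.4 * std::sqrt(kelvin / 273.0);   // m/sec, Equation 3.2
    }

    double rangeFromEcho(double echoSeconds, double kelvin) {
        // Equation 3.1: half the round-trip time times the speed of sound.
        return speedOfSound(kelvin) * (echoSeconds / 2.0);  // meters
    }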

[Figure 3.2: Cross Section of a Sonar Beam]

Sonar sensors are sensitive to changes in temperature and humidity since these affect the speed of sound in air [6] [8] [20]. As temperature and humidity increase, the amount of attenuation an ultrasonic signal will experience increases also. For a frequency of 50 kHz, attenuation can range from 0.6 to 1.8 dB/meter for temperature ranges of 17 to 28 C and relative humidity ranges of 15 to 70 percent [20]. This should be less of a problem in an indoor environment where these factors can be controlled by air conditioning. Indoor environments should provide range readings that will be consistent over time.

The surface characteristics of objects in the environment and their orientation to the main beam of the sonar will also affect the range readings of an ultrasonic sensor. Most interior surfaces act as mirrors for sound waves and if the surface is at an angle to the sonar beam, the return echo will be reflected away from the sensor. This will cause the object to appear to be farther away or to not be detected at all [3] [6] [10]. Brown [3] describes the problem as follows: "Using an ultrasonic sensor to look at arbitrary objects in a room is rather like standing in a room completely filled with mirrored objects and having only a penlight glued to your forehead as a source of light: specularity abounds and many surfaces are not visible."


The reflectivity or absorption characteristics of surfaces also can change the signal intensity of the echo received by the transducer. The differences in signal intensity can influence the rise time of the return echo in the sensor and cause more reflective objects to appear closer [6].

3.2.2 Polaroid 600 Ultrasonic Sensor

The Polaroid 600 ultrasonic sensor uses its transducer as both a transmitter and as a receiver. At the start of a ranging cycle a train of 16 transmit pulses at 49.4 kHz is transmitted by the transducer. To eliminate false readings caused by the transducer "ringing," the receiver circuitry is disabled for a brief period. Following the blanking period, the transducer acts as a receiver [20] [21].

The Polaroid 600 sensor can measure distances from 6 inches to 35 feet (420 inches) with an accuracy of +/- 1 percent over the range. The system used on the Scout robot restricts the range from 17 inches to 255 inches. The beam width of the sensor is 25 degrees [20] [21]. The sonar configuration of the Nomadic Technologies Scout robot has a ring of 16 Polaroid sensors spaced at 22.5 degrees around the circumference. Figure 3.3 is a picture of the sonar ring. Figure 3.4 shows the arrangement of the sonar sensors in the sonar ring.


Figure 3.3: Sonar Ring of the Scout Robot

Figure 3.4: Arrangement of the Sonar Sensors
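As a small illustration of the ring geometry (not part of the thesis software), the bearing of each sensor relative to the robot's heading follows directly from the 22.5 degree spacing:

// Bearing of sonar i on the sixteen-sensor ring described above.
// With this numbering, sonar 0 faces front, 4 left, 8 back, and 12 right.
double sonarBearingDegrees(int sonarIndex)
{
    return (sonarIndex % 16) * 22.5;
}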


3.3 Robot

A Nomadic Technologies Scout model was used in this research. An Application Programmer's Interface (API), written in the C programming language, is provided with the robot and provides the means with which to control the robot's movements and to collect data from the sonar ring. Figure 3.5 is a picture of the robot. The robot can be controlled by user-supplied software that runs in the Linux operating system environment. This software can run on a laptop computer that is attached to the Scout robot directly or on a workstation that communicates with the Scout via Radio Ethernet. This research used the workstation/Radio Ethernet configuration for controlling the robot. Figure 3.6 provides a diagram of this configuration.


Figure 3.5: Nomadic Technologies Scout Robot


Figure 3.6: System Configuration

3.4 Software

The movement and data collection of the robot and the feature identification algorithm are implemented in a C++ software program. This program was developed using the GNU g++ compiler and the Nomadic Technologies API. The program runs in the Linux OS environment.

The C++ program begins by initializing the robot in preparation for movement and the collection and processing of sonar data. The robot is commanded to traverse a corridor and collect sonar data at approximately fixed distance intervals (0.5 inches). The raw sonar data sets are collected into groups of specified size (one or six) and are averaged together.
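The averaging step can be pictured with a short sketch; the function name here is illustrative, the actual implementation is in Appendix A, and each reading is assumed to hold sixteen ranges, one per sensor.

#include <vector>
using std::vector;

// Element-wise average of a group of sonar readings (one or six in the
// experiments described here). The group is assumed to be non-empty.
vector<long> averageReadings(const vector< vector<long> >& group)
{
    vector<long> avg(16, 0);
    for (size_t i = 0; i < group.size(); ++i)
        for (size_t j = 0; j < 16; ++j)
            avg[j] += group[i][j];
    for (size_t j = 0; j < 16; ++j)
        avg[j] /= (long)group.size();
    return avg;
}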


The averaged sonar data is compared to a set of feature patterns which represent the various indoor environmental features. The averaged sonar data is matched against rules, i.e., what the sonar data should look like, that uniquely identify one feature from another. As features are detected, position data and the sonar data used for identification are associated with the feature. The detected features are logged and a report of the data collected is generated when the robot reaches the distance it was commanded to travel. For a flow chart of the program, see Figure 3.7. Refer to Appendix A for the program itself.

The features mentioned above are complete features found in an interior environment such as corridors, alcoves, etc. This Feature Set encompasses knowledge of the environment that the robot will be operating in. If the robot detects a feature that does not satisfy any of the available rules then the feature is identified as unknown. While there is no a priori map that the robot can refer to, there is a priori knowledge about the type of environment it will be operating in [25]. Of course, it would be possible to add new features to the Feature Set to handle alternative environments. The Feature Set is presented in the following section.


Figure 3.7: Flow of the C++ Program (start; initialize variables, patterns, and sensors; command robot to traverse environment; collect sonar data until enough data is available to match against features; filter data; compare data to feature rules; identify feature; repeat until the environment is traversed)


3.4.1 Feature Set

This work is based on the identification of "large-scale" features in an indoor environment. As opposed to being represented as occupied and unoccupied cells in a grid-based map or as a collection of planes, corners, etc., the features represent complete entities or components of the environment. Figure 3.8 is the set of features that were expected to be seen in the environment in which the robot was operating. The features in this set are geometrically simple, angular, and rectilinear, as can be seen in Figure 3.8. A rectilinear environment implies that the angles where features meet are 90 or 180 degrees. The feature identification approach of this thesis was only tested in a rectilinear environment. This is because the test environment, the North Classroom building of the Auraria Campus, is composed of these shapes. Brief descriptions of the members of the feature set are below.

Corridor - A feature that appears "long and narrow", i.e., the distances in front and back of the robot are significantly greater than those to the sides of the robot. For this work an aspect ratio of 3 (distance front and back > 3 x distance to the sides) was used (Figure 3.8a). This definition of a corridor eliminates the assumption of a specific corridor width. Figure 3.9 presents a graph of the predicted sonar data for this feature. The graph shows the expected range reading for each of the sonar sensors in the robot's sensor ring starting at sonar 0 and "wrapping around" the ring back to sonar 0.


Figure 3.8: Indoor Feature Set (a: Corridor; b: Four Way Intersection; c: Up T Intersection; d: Across T Intersection; e: L Intersection; f: Alcove; g: Dual Alcove; h: Corridor End; i: Alcove End; j: Dual Alcove End)


Figure 3.9: Predicted Sonar Data for a Corridor

Four Way - An intersection of two corridors where both corridors continue on for some distance. Around the center of this feature distances to the front, back, and sides of the robot are long (Figure 3.8b). Figure 3.10 shows the predicted sonar data for this feature.

Figure 3.10: Predicted Sonar Data for a Four Way Intersection

T Intersection - An intersection of two corridors where one of the corridors ends. For this research two features represent this concept: Up T and Across T. An Up T is detected by


the robot when it is moving up the lower part of the T (Figure 3.8c). An Across T is the situation where the robot is moving across the top of the T (Figure 3.8d). Predicted sonar data for going up a T intersection and across a T intersection are shown in Figures 3.11 and 3.12 respectively.

Figure 3.11: Predicted Sonar Data for an Up T Intersection

Figure 3.12: Predicted Sonar Data for an Across T Intersection


L Intersection - An intersection of two corridors where both corridors end. The robot would detect a long distance to the rear and to one side (Figure 3.8e). Predicted sonar data for this feature is shown in Figure 3.13. This graph assumes that the robot is entering the L such that one corridor is behind the robot and the other is to the right of the robot.

Figure 3.13: Predicted Sonar Data for an L Intersection

Alcove - A feature where one side of a corridor is wider than the "normal" width of the corridor for some distance (Figure 3.8f). Figure 3.14 presents the predicted sonar data for this feature.

Figure 3.14: Predicted Sonar Data for an Alcove


Dual Alcove - Similar to an alcove except that both sides of the corridor are wider than normal (Figure 3.8g). Predicted sonar data for this type of feature is shown in Figure 3.15.

Figure 3.15: Predicted Sonar Data for a Dual Alcove

Corridor End - A feature that represents the end of a corridor. It differs from a corridor in the fact that one end is closed (Figure 3.8h). Figure 3.16 presents predicted sonar data for a Corridor End.

Figure 3.16: Predicted Sonar Data for a Corridor End


Alcove End - A corridor end with one side wider than is normal for the corridor (Figure 3.8i). Figure 3.17 represents the predicted sonar data set for this feature.

Figure 3.17: Predicted Sonar Data for an Alcove End

Dual Alcove End - A corridor end where both sides are wider than normal for the corridor (Figure 3.8j). A graph of predicted sonar data for a Dual Alcove End is shown in Figure 3.18.

Figure 3.18: Predicted Sonar Data for a Dual Alcove End


As mentioned in the ultrasonic sensor section above, one source of uncertainty in sonar readings is caused by the beam spreading as it gets farther away from the sensor. The approach used in this work to detect features actually makes use of this fact. Patterns can be detected by using sonar readings from different sensors that are taken at the same time and location.

The mechanism for matching sonar data to a feature pattern is by applying a set of rules that the feature pattern contains to the collected sonar data. The rules of a feature pattern are used to differentiate it from other feature patterns. When sonar data is processed it is matched against the rule sets of each feature pattern. The feature pattern with the highest "matching percentage" identifies the feature that the sonar data described. The matching percentage for a feature is determined by dividing the number of rules satisfied by the total number of rules the feature contains. Thus, if the sonar data satisfies all the rules for a particular feature it will have a matching percentage of 100.

Matching Percentage = # of rules matched / # of rules    (3.3)

If there is more than one feature pattern with the highest matching percentage a simple tie-breaking algorithm is used. If one of the best matches is a corridor, it will be identified as the feature. If one of the best matches is a four way intersection it will be selected unless one of the other candidate feature patterns is a corridor. Otherwise, the first feature detected with the highest matching percentage is identified as the feature.
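A compact sketch of this scoring and selection step follows; the types and member names here are illustrative stand-ins, not the thesis's actual interfaces (those appear in Appendix A).

#include <vector>
using std::vector;

// Illustrative stand-ins for the real rule and pattern classes.
struct Rule
{
    virtual bool match(const vector<long>& theData) const = 0;
    virtual ~Rule() {}
};

struct FeaturePattern
{
    vector<Rule*> rules;
};

// Equation 3.3: the percentage of a pattern's rules the data satisfies.
double matchingPercentage(const FeaturePattern& pattern,
                          const vector<long>& theData)
{
    size_t matched = 0;
    for (size_t i = 0; i < pattern.rules.size(); ++i)
        if (pattern.rules[i]->match(theData))
            ++matched;
    return 100.0 * matched / pattern.rules.size();
}

// The pattern with the highest percentage wins; the tie-breaking order
// described above (Corridor first, then Four Way) would be layered on top.
const FeaturePattern* bestMatch(const vector<FeaturePattern>& patterns,
                                const vector<long>& theData)
{
    const FeaturePattern* best = 0;
    double bestPct = -1.0;
    for (size_t i = 0; i < patterns.size(); ++i)
    {
        double pct = matchingPercentage(patterns[i], theData);
        if (pct > bestPct)
        {
            bestPct = pct;
            best = &patterns[i];
        }
    }
    return best;
}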


3.4.2 Rules

When sonar data is being compared to the various features to determine which feature the data matches, the range readings are actually being applied to a set of rules that each feature pattern contains. It is how well the sonar observations conform to one of these rule sets that determines what feature the robot identifies. Brief descriptions of the rules are presented below. Table 3.1 identifies which rules are used by which feature patterns.

BackLongRule - This rule is used to check whether the sonar data indicates there is a distance greater than or less than the threshold for "long" distances to the rear of the robot. If the value is greater than the threshold the rule is true.

CorrRule - This rule is used to check whether sonar readings indicate that the aspect ratio of the forward and backward pointing readings to the left and right facing readings is above or below a set value. If the aspect ratio is greater than the threshold the rule is true.

FrontLongRule - If the reading for the sonar pointing directly forward is greater than the specified "long" threshold then there is open area for a long distance in front of the robot. In this case the rule returns true.

FrontShortRule - This rule is used to check for obstacles near the front of the robot. If the reading for the sonar facing directly forward is less than the specified "short" threshold then the rule is true.

SideInterRule - The rule is true if there is an object at a distance between the short and long thresholds to one side of the robot.


SideLongRule - This rule returns true if either the sonar facing directly to the left or directly to the right of the robot indicates a long open area.

SideShortRule - This rule is true if the range reading for the sonar sensor facing directly left or the sonar sensor facing directly right is less than a specified threshold.

TwoSideInterRule - This rule returns true if there are obstacles to both sides of the robot at distances that are between the short and long threshold values.

TwoSideLongRule - This rule returns true if the sonar sensors that directly point left and right indicate that there are long open areas to the sides of the robot.

TwoSideShortRule - This rule is true if the readings for both of the side-facing sensors are less than a specified threshold.


Rule: Features using the rule

BackLongRule: Corridor; Four Way Intersection; Up T Intersection; Across T Intersection; L Intersection; Alcove; Dual Alcove; Corridor End; Alcove End; Dual Alcove End
CorrRule: Corridor
FrontLongRule: Corridor; Four Way Intersection; Across T Intersection; Alcove; Dual Alcove
FrontShortRule: Up T Intersection; L Intersection; Corridor End; Alcove End; Dual Alcove End
SideInterRule: Alcove; Alcove End
SideLongRule: L Intersection; Across T Intersection
SideShortRule: L Intersection; Alcove; Alcove End
TwoSideInterRule: Dual Alcove; Dual Alcove End
TwoSideLongRule: Four Way Intersection; Up T Intersection
TwoSideShortRule: Corridor; Corridor End

Table 3.1: Rule to Feature Mapping


The match function of a rule performs the test to determine whether the sonar data satisfies the rule or not. Figure 3.19 shows the match function from FrontLongRule. The complete code for all the rules may be found in Appendix A.

bool FrontLongRule::match(const vector<long> &theData) const
{
    return (getFront(theData) > itsThreshold) ? true : false;
}

Figure 3.19: match Function from FrontLongRule
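For contrast with the single-threshold test in Figure 3.19, a plausible sketch of a two-threshold rule such as SideInterRule is shown below. This is illustrative only; the authoritative implementations are in Appendix A, and the strict inequalities reflect one reasonable choice for classifying a reading that exactly equals a threshold (an issue that surfaces in the L intersection experiments of Chapter 4).

// Hypothetical sketch in the style of Figure 3.19 (see Appendix A for
// the real code): true if either side sensor reports a range strictly
// between the short and long thresholds.
bool SideInterRule::match(const vector<long> &theData) const
{
    long left  = getLeft(theData);    // sonar 4
    long right = getRight(theData);   // sonar 12
    return (left  > itsShortThreshold && left  < itsLongThreshold) ||
           (right > itsShortThreshold && right < itsLongThreshold);
}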


4. Results

4.1 Overview

This chapter covers experiments conducted using the robot and software described in Chapter 3. This system was tested in several runs at different locations within a building. The system was able to properly identify the features it traversed. The results of identifying individual features are presented first. A specific experiment to identify multiple features in a single run is then discussed. A summary of the results is then presented.

4.2 Results of Individual Feature Identification

A series of experiments was run to ascertain whether the approach described in Chapter 3 successfully identifies individual features. A typical experiment involved configuring the distance for the robot to travel and the number of sonar readings to average together for feature determination. Experiments were conducted using either single readings or sets of six sonar readings averaged together for feature identification. All these values are contained in the robot.cfg file. An example of the file is contained in Appendix B.
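For a rough sense of what such a file holds, a hypothetical excerpt might look like this; the parameter names are invented for illustration, and the real contents are in Appendix B.

# Hypothetical robot.cfg excerpt (see Appendix B for the real file)
travel_distance    480    # inches for the robot to traverse
readings_per_avg   6      # 1 = raw data, 6 = filtered (averaged) data
short_threshold    54     # inches
long_threshold     78     # inches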


Once the software variable values had been set, the robot was placed in a feature, usually a corridor, aligned to traverse the feature or set of features in as straight a line as possible, and turned on. The software controlling the robot was then started and the experiment proceeded. The robot would identify the initial feature it was in while moving, but as subsequent features were identified the robot would stop to give a visual cue that this event had occurred. When the robot reached the prescribed distance to travel it stopped and reported the features it had detected. Sonar, feature identification, and debugging data were collected to files to allow post-experiment analysis. The data collected for an experiment to identify a Corridor are presented in Appendix B.

The features to be identified were presented in Chapter 3 as part of the feature set. The results of identifying these individual features are presented in the following sections.

4.2.1 Corridor

A Corridor is composed of two walls and is long and narrow in shape. Figure 4.1 is a scatter plot of the sonar data collected for a Corridor.

Figure 4.1: Sonar Plot for a Corridor


The solid circle in Figure 4.1 represents the position of the robot at the end of the run. The shapes and range values of predicted and actual sonar data for a Corridor are plotted in Figure 4.2.

Figure 4.2: Predicted and Actual Sonar Data for a Corridor (a: predicted and actual sonar shapes; b: predicted and actual sonar ranges)

The important elements of the Corridor feature are large distance ranges to the front and back of the robot, as reflected by the readings from sonar 0, the sonar which points directly ahead of the robot, and sonar 8 which points directly behind the robot. Side readings, from sonar sensors 4 and 12, would be much smaller because of the Corridor's walls.


While the actual data matches the shape of the predicted data very well, it is interesting to note that the magnitude of the range readings to the front and back are smaller than would be expected. This is discussed in Section 4.4.

During test runs Corridors were always successfully identified. One run did demonstrate that if the robot was significantly offset from the mid-line of a hallway the feature would be identified as an Alcove. This is because the rule set for the Corridor feature looks for two walls at near range while the alcove feature rules expect one near wall and one wall at a longer distance away. Corridors and Alcoves were sometimes identified with equal certainty because the Corridor Rule in the Corridor's rule set was not true. As described in Chapter 3, this rule looks at the aspect ratio of the feature to see if the length is a set threshold longer than the width. When the front, back, or both sonar sensors reported shorter than expected values, this rule could be violated. Occasionally, a Corridor would be incorrectly identified as a Corridor End due to short front range readings. These readings were transient and probably caused by a strong reflection off of a corner or a highly acoustically reflective object.

4.2.2 Four Way Intersection

Four Way intersections occur when two orthogonal corridors cross each other and both continue for some distance. Figure 4.3 is a scatter plot of sonar data for a Four Way intersection.


Figure 4.3: Sonar Plot for a Four Way Intersection

The corridor that the robot was traversing and the corners where the two corridors meet are easily identifiable. The cross corridor is not so well defined in the sonar plot. This is not critical since the robot identified the feature as a Four Way intersection based on having long ranges of open space to the front, back, and sides.

The first experiments to identify a Four Way intersection were not very successful. The Four Way features were misidentified as Up T or Dual Alcove features. This was primarily caused by setting the long threshold for the rules to 120 inches and the short threshold to 60 inches. As noted above, long range readings were not as long as they were expected to be. The long and short threshold values were lowered to 78 inches and 54 inches, respectively, based on the data collected in various experiments.

The rule set for the Four Way pattern was also modified. The Four Way corner rule (FWCornerRule) was removed from the rule set. Analysis of sonar data showed that the reasoning behind the rule was somewhat naive. In looking at a Four Way intersection it could


be expected that the corners would return short ranges while the hallways would return long range readings. In actuality the corner areas return longer range readings than the open corridors. This was most likely caused by specular effects. Figure 4.4 compares the shapes and the range values of predicted and actual sonar data for a Four Way intersection.

Figure 4.4: Predicted and Actual Sonar Data for a Four Way Intersection (a: predicted and actual sonar shapes; b: predicted and actual sonar ranges)

With the changes to the threshold values and the Four Way rule set, these features were always identified once the robot was into the feature. However, misidentifications do


occur frequently on entering and exiting a Four Way intersection. This is mainly caused by one of the side sonar sensors "seeing" the cross corridor before the other. The pattern matched in this circumstance was that of an Across T instead of a Four Way intersection. Once the other side sensor received long range readings the feature was correctly identified as a Four Way intersection. Averaging sonar readings together usually eliminated one incorrect identification but did not eliminate them entirely. An interesting side note was that the right side sensor (12) consistently detected a new feature first. This does not appear to be caused by a misalignment of the left or right side sensors. It is possible that the main lobe of sensor 12 is slightly wider than normal. The misidentification of a Four Way as an Across T occurred because the front sensor, back sensor, and a side sensor reported long readings while the other side sensor did not see a long range.

4.2.3 Up T Intersection Figure 4.5 presents the sonar scatter plot for an Up T intersection feature ... -. .... ... --'t;..,;;... .-.. :-.-... ,. -_-:._=-t, -' .. ,_-;> -. -1 <-: ':1. -i!'! .. J( II -/ ________ Win:b.l Hct awil-'lle In robot IIIOCZ. Erw:2der Yo(IOOGOOO( S.COOO T-oox> CCNass ((10() cor-.c:rld: : U 1 1. U nits: c oo nUnates : 0.1 q : es : 0.1 deg-ees Figure 4.5: Sonar Plot for an Up T Intersection The scatter plot shows that the ultrasonic sensors detected the walls ofthe corridor that the robot was originally traversing and the wall of the cross corridor that'it was approaching. The cross corridor is indicated by the open spaces to the right and left of the robot. Graphs of the shapes and range values for predicted and actual sonar data are presented in Figure 4.6. 58

PAGE 72

250 c;; 200 Q) .r:. g 150 & 100 c: 50 0 1 15 2 14 3 13 4 ilP 12 --Predicted --Actual 5 11 6 10 7 9 8 (a) Predicted and Actual Sonar Shapes 01 2 3 4 56 7 8 91011121314150 Sonar Number (b) Predicted and Actual Sonar Ranges --Predicted --Actual Figure 4 6 : Predicted and Actual Sonar Data for an Up T Intersection The shape of the actual data matches closely with that of the predicted data. The robot senses long ranges to the sides and rear with an obstacle or obstacles to the front. Once again, a major difference between expectations and real data was the magnitude of the long range readings. As the robot approached an Up Tit initially identified the feature as a Corridor As it moved closer to the T itself there was a period where the wall blocking its path was below 59

PAGE 73

the long threshold but above the short threshold for the rules set. Misidentifications of the feature occurred during this region Runs using one sonar reading for identification indicated that the robot had entered a corridor end when the front range readings were between the long and threshold values As with the Four Way features sonar 12 (right side) began detecting long ranges a few inches before the left side sonar. This lead to the identification of the Up T as an L until the left side sonar began reporting long ranges If the cross corridor was wide enough then a Four Way intersection was identified until the range from the robot to the wall in front of it dropped below the short threshold At this point the feature was correctly identified as an Up T and did not change from this identification. Experiments that averaged six sonar readings together for pattern matching produced fewer misidentifications Readings that had the right side sonar reporting long ranges before the left side sonar were averaged with prior readings that would have matched a Corridor feature. The result was that the entry into the Up T was identified as an Alcove. The identification then switched to a Four Way intersection until the range to the cross corridor wall dropped below the short threshold The feature was then correctly identified as an Up T intersection. 4.2.4 Across T Intersection Across T intersections are related to Up T features by the fact that they are the same feature but the robot approaches them from different directions A sonar data plot for an Across Tis presented in Figure 4.7. 60

PAGE 74

1 11: !J .-;;. =. Y indow bcu-do: lLHXXl04l14,-oooo1809), Actual posltlcr; tkX. aonslal>le :n n:e:l r cbot Encoder posltlcr: Y=-Q()()(OOOC S--ooo4 1=0())4 tompass v&lue: tM Pre-JiOU$ ws( l 1. 1. 2) tOJrdinatee, = O.l inches; angles = 0.1 deg"!:es ------.. .. .. .... .. . .. ''-".''\.' ... ;: /,r ... -.... '.t Figure 4.7: Sonar Plot for an Across T Intersection The scatter plot very clearly shows the corridor that the robot was traversing. The hash marks around the open area representing the other corridor are the corners where the two corridors meet. The two bumps in the corridor on the left of the plot are half columns which projected into the hallway. The intersecting hallway looks very much like the cross corridor in the Four Way sonar plot. Comparisons of predicted and actual sonar data are presented in Figure 4.8. 61

PAGE 75

250 en 2oo Cl) .c::: g 150 ::::. 0 1 15 2 14 3 13 4 12 --Predicted --Actual 5 11 6 10 7 9 8 (a) Predicted and Actual Sonar Shapes --Predicted Cl) l --Actual 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 0 Sonar Number (b) Predicted and Actual Sonar Ranges Figure 4;8: Predicted and Actual Sonar Data for an Across T Intersection The actual sonar data indicates long ranges to the front and back of the robot. One side also has long range readings while the other side indicates an obstacle at short range. The double hump of sonar sensors 2 through 4 may indicate specular reflection off the corner to the front and left of the robot. The actual and predicted shapes match up very well. There were no feature misidentifications when the robot transitioned into the Across T from the Corridor and out of the Across T back into a Corridor in runs using one or six 62

PAGE 76

sonar readings for feature matching. However, in the runs which used averaged data single sonar readings indicating an Alcove or L feature were collected when the T was entered. These data points were filtered out by the averaging algorithm. As indicated above, these readings were caused by sonar returns that should have been long but were reported as short. fu these cases an Across T could be misidentified as an Alcove or an L intersection. 4.2.5 L Intersection A sonar plot for an L intersection would show that the ultrasonic sensors had detected the corridor that the robot was traversing, the wall of the intersecting corridor in front of it, and open space to one side where the other corridor was located. Figure 4.9 is a sonar plot of this type of feature .J ' ____, aur-a... b:ud:: 1.12C.O::oo3t90,-.xl00lno l:k:Wil Kot ..,dlflbl h ,...1 robot. -.ocM. Enccder V=))()OolOOO S=0001 1::0001 0000 Previous c::a.ord: wsC1, 1, 1 2> Lhib: 0.1 0.1 Figure 4.9: Sonar Plot for a L futersection The half-circle in the corridor is a column that was not flush with the wall. The hash marks across from the column indicate a doorway with the door closed. 63

PAGE 77

For an L intersection it would be expected that the front sonar and one side sonar would indicate short ranges while the back sonar and the other side sonar would report long ranges. Graphs comparing actual sonar data to predicted sonar data for an L intersection are shown in Figure 4.10. 250 li) 200 Cl> 150 & 100 c: 0 1 15 2 14 3 13 --Predicted 4 f!P 12 --Actual 5 11 6 10 7 9 8 (a) Predicted and Actual Sonar Shapes --Predicted --Actual 50 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 0 Sonar Number (b) Predicted and Actual Sonar Ranges Figure 4.10: Predicted and Actual Sonar Data for an L Intersection The shapes of the actual and predicted data match very well. The difference in magnitude of the long ranges is not surprising given that this difference was seen before 64

PAGE 78

when identifying other features. Another contributor could be the narrowness of the two corridors forming the L where the tests were performed. Experiments using raw sonar data, that is single instances of sonar data for feature pattern matching, had no misidentifications transitioning from the Corridor into the L. However, it should be noted that sometimes only 50% (2 out of 4) of the L intersection rules were being matched. This was in the area where the wall in front of the robot was between the long and short 'range thresholds and the back range reading dropped below the short threshold. Runs averaging six sonar readings together for feature identification displayed much greater problems with correctly identifying the L intersection. The data shows that the corridor was correctly identified and at the transition point to the L an L feature was identified. However, this identification was quickly switched to an Alcove and then to an Alcove End. While the L had rule matching percentages that equaled the Alcove and the Alcove End, the algorithm preferred the other features when there was a tie. At first it appeared that the averaging was causing this misidentification by having a bad reading or readings affect the average values of the data used for pattern matching. The real reason was that the range measurements for one of the side sonar sensors, number 4, reported a value that was equal to the short threshold. It turned out that this value would cause rules looking for short ranges and rules looking for a range between the short and long thresholds to be satisfied. If this value only indicated a short range then the L feature. would have been correctly identified. The rules, SidelnterRule and SideShortRule, were modified 65

PAGE 79

and the tests were rerun to prove this hypothesis. With these modifications the L intersection was correctly identified. 4.2.6 Alcove An Alcove looks something like an Across T except that instead of a long cross corridor there would be a "pushing out" of the corridor. The sonar data plot for such a feature is presented in Figure 4.11. Robot VIdow bo.nd:: U.HXl00014,-18W Actual posJtlor: rtct a.nllable :n real robot JI01:. E=dor pcxsltt Uni:.s: tQCir"(Sinote;$ = 0.1 inches: 519les = 0.1 deg"eeS . r Figure 4.11: Sonar Plot for an Alcove The plot shows the Corridor with the Alcove being the trapezoidal section pushed out to one side. The expected shape of the sonar data for this type of feature would be to have long ranges to the front and rear, a short range to one side, and an intermediate range, one between the short and long thresholds, to the other side. Figure 4.12 compares predicted and actual sonar data for this feature. 66

PAGE 80

250 en 2oo Cll .J: (,) 150 .: -. & 100 c 50 0 1 15 2 14 3 13 4 12 5 11 6 10 7 9 8 (a) Predicted and Actual Sonar Shapes 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 0 Sonar Number (b) Predicted and Actual Sonar Ranges --Predicted --Actual --Predicted --Actual Figure 4.12 : Predicted and Actual Sonar Data for an Alcove Tests using one data reading and six data readings for feature identification correctly identified the Alcove feature. There were no incorrect feature identifications at the transitions to and from the Alcove. It should be noted that if an Alcove were shallow enough it would not be identified as a separate feature but would be considered part of the Corridor. Deep Alcoves would be classified as Across T intersections. 67

PAGE 81

4.2.7 Dual Alcove A Dual Alcove could be considered halfway between a Corridor and a Four Way intersection. As Figure 4.13 shows, a Dual Alcove looks like a Four Way intersection but the ends of the cross corridor are detectable by the robot's sonar sensors. ..,: --. . --' \.._ ---<:":'! ::;.._... .... -----------. \: "?!" :!>. ... r-7" -:_ Y i"""' l:lulch: L(-()l004l84.-wc1C14S4L Actual po
PAGE 82

250 'iii 200 GJ .c g 150 100 c 50 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 0 Sonar Number (b) Predicted and Actual Sonar Ranges --Predicted --Actual Figure 4.14: Predicted and Actual Sonar Data for a Dual Alcove The shapes of the predicted and actual data are very consistent. The "valleys" at sonar numbers 2, 5, 11, and 14 would appear to be caused by returns off the walls of the alcoves on either side of the corridor. fu all runs the right side sonar sensor, number 12, detected the start and end of the dual alcove first. Thus, the identification of features went Corridor, Alcove, Dual Alcove, Alcove, Corridor. It is possible to have a staggered Dual Alcove, that is, where an Alcove starts (or ends) on one side of a Corridor before an Alcove on the other side of the Corridor, but for most of the feature it is a Dual Alcove. Unless the stagger between the starts of the Alcoves is significant, it would be very difficult to tell the difference between a real staggered Dual Alcove and one that was created by one side sonar sensor detecting a feature prior to the other side sonar sensor. 69

PAGE 83

4.2.8 Corridor End Arriving at the end of a Corridor the sonar sensors of the robot would produce an image like that ofFigure 4.15. r i '.$ .... ._i ..... .. .. --. O "'. y.._;..;l .. . I Mi.td.l lA.u..b posit:on: Hot. in rea: robot Jode. &coc1!:r' posl t : on; X;.(C(IOC614 t=.c:>>oooooo S=OOCO T:OOOO Ccapass value: ()()(Q ff'evi::US w:s(l, 1 1. lhits: = C.1 inches: angle$ = 0.1 cegrees Figure 4.15: Sonar Plot for a Corridor End The sides of the Corridor and the end wall are clearly seen. The bulges on both sides at the start of the Corridor are doorways The sonar range readings for a Corridor End would be expected to have a long range to the rear of the robot and short distances to the front and sides. Figure 4.16 compares predicted sonar data to actual sonar data collected in a Corridor End. 70

PAGE 84

0 1 15 2 14 3 13 --Predicted 4 12 --Actual 5 11 6 10 7 9 8 (a) Predicted and Actual Sonar Shapes 250 roo! :. 150 1 Q) 100 g> I 50 --Predicted --Actual 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 0 Sonar Number (b) Predicted and Actual Sonar Ranges Figure 4.16: Predicted and Actual Sonar Data for a Corridor End While the shapes of the predicted and actual data are similar it is apparent that the range readings to the back of the robot are much less than would be expected. The explanation for this probably relates to the narrowness of the hallway in which the tests were run The corners of the doorways at the start of the corridor also provided much better surfaces for returns versus flat walls that would have reflected most of the sound energy away from the sensors. 71

PAGE 85

In tests, identification of the feature fluctuated between Corridor and Corridor End until the range to the wall in front of the robot fell below the short threshold. The changes in feature identification were driven by the range readings obtained by the rear pointing sonar sensor. When the readings showed a long range then a Corridor matched that data better. When short ranges to the back of the robot were reported then Corridor End provided a better match. Once the range to the front wall fell below the short threshold the Corridor End feature was a better match than Corridor of the range the rear-facing sensor was reporting 4.2.9 Alcove End The Alcove End feature is a combination of the Corridor End and Alcove features. This feature terminates a hallway but one side of the feature is wider than the normal corridor width. Figure 4.17 is a scatter plot of sonar data for a feature of this type. I WirW111 lMaP.h:. lLC""'(llCC<4374,-oml.7&4), pcsit;on: Not in rea: robot. tode. &c:ocL.or PG$1 t:oo: X=...coooce.l:S S=OOCO T=CIOOO Cc:lllpa= value: OC((l C'OIIIMld: ws
PAGE 86

As with the Corridor End, the sonar range readings for an Alcove End would be expected to have a long range to the rear of the robot and a short distance to the front. One side would have short range readings while the other would indicate intermediate ranges. Figure 4 .18 compares expected sonar data to actual data for an Alcove End. 250 'iii 200 Q) .c: u 150 .E 100 c 111 c::: 0 1 15 2 14 3 13 4 12 5 11 6 10 7 9 8 (a) Predicted and Actual Sonar Shapes 0 1 2 3 4 5 6 7 8 9 10 1.1 12 13 14 15 0 Sonar Number (b) Predicted and Actual Sonar Ranges --Predicted --Actual --Predicted --Actual Figure 4.18: Predicted and Actual Sonar Data for lin Alcove End Similar to the actual data collected for the Corridor End, the range data to the rear of the robot for the Alcove End feature is not very long. As with the Corridor End this is most likely due to the narrowness of the hallway being used for the test. 73

PAGE 87

In testing, the feature identification fluctuated between Corridor and Corridor End until the range to the wall in front of the robot was less than the short threshold. The feature was then identified as a Corridor End until the Alcove could be observed by a side facing sonar. When this happened the feature identification was switched to Alc?ve End. This misidentification is very similar to what happened in identifying an Up T Until the robot was actually in the feature in this case an Alcove End, the data it was collecting matched another feature perfectly. Once it was in the feature a correct identification was possible. 4.2.1 0 Dual Alcove End A Dual Alcove End is very similar to an Up T feature. The major difference is that the ends of the cross corridor are detectable by the robot sonar sensors in a Dual Alcove End. A sonar scatter plot for this feature is presented if Figure 4.19 ..1 I .: \ -. e-j J Wiu.k.l Wr.aa.b. .. -ol001"64>. l.Rc-.ooc04m,.oxm'S!i> A:tual po::i t :O'I: tbt .1\lailabh in rea: robot 'ode. pot I t :m; X..-..coooc:608 t:.C>XIOOOOO S=OOCO T:OOOO ro:o Prevl:l.IS COIIAJ"'l: W:S(l, 1 1 lh1t.:: coordinote: : C.1 i.rd!es:: a"l9les : 0.1 cegrees -:..__, Figure 4.19: Sonar Plot for a Dual Alcove End 74

PAGE 88

The sonar range readings for a Dual Alcove End would have a long range to the rear of the robot and a short distance to the front. Both sides would indicate intermediate ranges. Figure 4.20 compares predicted sonar data to actual data collected for a Dual Alcove End. 0 1 15 2 14 3 13 4 tJ --Predicted 12 --Actual 5 11 6 10 7 9 8 (a) Predicted and Actual Sonar Shapes 250 j 2001 :. 150 8, 100 c 50 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 0 Sonar Number (b) Predicted and Actual Sonar Ranges --Predicted --Actual Figure 4.20: Predicted and Actual Sonar Data for a Dual Alcove End As with the Corridor End and Alcove End, the shapes of the predicted and actual data for a Dual Alcove End are similar but the magnitudes are off. Once again this is probably due to the close confines of the area used for testing. 75

PAGE 89

The experiments in identifying this feature followed the pattern of the Alcove End testing. The feature identification fluctuated between Corridor and Corridor End until the range to the wall in front of the robot was less than the short threshold. The feature was then i dentified as a Corridor End until an Alcove could be observed by the right side facing sonar. When this happened the feature identification was switched to Alcove End. Once the Alcove on the left side was seen by the left side sonar sensor the feature was correctly identified as a Dual Alcove End. 4.3 Results of Multiple Feature Identification A set of experiments was run to verify that the robot could successfully identify a series of features as it traversed an indoor environment. The tests were set up similarly to the description in Section 4.2 The distance to be traversed for these experiments was set at 480 inches ( 40 feet). Tests were run with one of two values (one or six) for the number of sonar data readings to average together for feature identification. Using one data reading provided identification using raw data. Sets of six readings produced filtered or averaged data. The short range threshold was set to 54 inches and the long range threshold was set to 78 inches. The series of features that the robot maneuvered through was as follows: Alcove Corridor Double Alcove Corridor Four Way Intersection Corridor 76

PAGE 90

One feature that was not part of the region to be traversed but that may have impacted the test was an alcove end that terminated the corridor beyond the end of the test area. This alcove end had glass walls on each side and a metal door facing the test region. This area was acoustically very specular, a fact that became obvious when entering it and having sounds echo back very strongly. To determine if this area affected the experiments one test was run with a sound absorbing surface placed in front of it very near to the end of the forty foot test distance. This environment is depicted in Figure 4.21 The location where the sound dampening material was placed is indicated by the dashed line. Figure 4.22 is a photograph of a portion of the area the robot traversed. The metal door of the alcove end can be seen in the picture. Figure 4.23 is a sonar plot of this environment. lo Start Figure 4.21: Map of Multiple Feature Environment 77

PAGE 91

Figure 4.22: A Portion of the Multiple Feature Environment 78

PAGE 92

-----------------------::;------Uirdool ban:ls! U.<->l02254,- Previous -..-.l: os
PAGE 93

Once both side ultrasonic sensors could detect the Alcoves to either side of the robot the rules for the Dual Alcove were matched 1 00% The trash can at the start of the right hand side Alcove had no effect other than to cause the right side ranges to report 20 inches shorter than the left side until the robot completely passed the trash can. The trash can was observed in the sonar readings for approximately 17 20 inches. The transition from Dual Alcove to Corridor was detected by the right side sonar sensor first. This caused an identification of an Alcove in all runs except one that used averaged data The Dual Alcove in this case was symmetric so the detection of Alcoves on entering and leaving the feature were incorrect and not an indication that the Alcoves were offset. Immediately after exiting the Dual Alcove all the runs have location or small distance where the range readings to the front of the robot drop. This caused all the raw data tests and all but one of the averaged data runs to identify a Corridor End feature. This location was from three to six inches in length. The averaged data experiments did not always filter this data out due to the fact that the distance over which the data was filtered was approximately three inches. This was probably caused by a sonar ping being received by the forward pointing sensor that had been generated by a previously fired transducer. The front facing sonar probably mistook this multi-bounce return as the return to the ping it generated. There are some corners in the corridor which could have causes the multi-bounce effect. Mter this small area the algorithm correctly identified the Corridor and maintained this identification until the robot reached the Four Way intersection. Entering into the Four 80

PAGE 94

Way intersection all runs except one averaged data test identified an Across T feature. This was because the front, back, and right side sonar sensors were reporting long distances while the left side sensor was seeing a short range. The one averaged data test reported an Alcove instead of an Across T. This misidentification was caused by the fact that. half the data readings matched a Corridor and the other half matched an Across T. Averaging these readings together yielded ranges that matched an Alcove feature. Once the left sensor began reporting long ranges the intersection was properly identified. Approximately three feet into the Four Way intersection front range readings began to drop to levels that caused the algorithm to identify the feature as either a Four Way intersection or as an Up T intersection. The feature matching algorithm will select a Four Way over other features other than a Corridor, so the intersection was still correctly determined to be a Four Way intersection. The front range readings continued to go up and down through the rest of the intersection. Using averaged data reduced the impact of these range fluctuations on feature identification. One note of interest is that by the time the robot was entering the Four Way intersection it was obvious that it was "pulling" to the right. The robot drifted to the right by four to six inches by the end of a run in all tests. This had never been seen in previous tests but did not appear to affect the identification of features. The last feature in the test series was a Corridor. The runs using raw data identified a transition from a Four Way intersection into a Corridor End instead of a Corridor. The data shows that the front range readings were fluctuating above and below the short range 81

PAGE 95

threshold. The correct identification of a Corridor was made when the front range value would rise but the majority of readings indicated a Corridor End. Test runs using averaged data identified an Alcove or Dual Alcove when moving from the Four Way intersection into the Corridor. Averaging effects account for this as they did when the robot entered the Four Way and identified the feature as an Alcove. The averaged data runs that did not have sound deadening material across the Corridor identified the feature as a Corridor End as the raw data tests had done. The test with the material in the hallway correctly identified the feature as a Corridor and then identified a Corridor End when the range to the material dropped below the short range threshold. Except for the Alcove section and the first Corridor section all the features were misidentified upon entry or exit in one or more of the runs. The averaged data runs did better at correctly identifying features and maintaining that identification than the raw data tests. This performance difference is even greater if single or transient feature identifications are discarded from the runs. Table 4.1 compares the number of correct feature identifications to the total number of feature identifications performed for runs using raw and averaged data. Table 4.2 lists the features identified in the experiments with transient features discarded. Number of Number of Correct Percent Data Type Identifications Identifications Correct Raw 388 339 87 Averaged 63 59 94 Table 4.1: Percentage of Correct Feature Identifications 82

PAGE 96

Actual Raw Data Averaged Data Feature Tests Tests Alcove Alcove Alcove Corridor Corridor Corridor Dual Alcove Alcove Dual Alcove Dual Alcove Alcove Corridor Corridor Corridor Corridor End Corridor Four Way Across T Four Way Four Way Corridor Corridor Corridor Corridor End Corridor End Table 4.2: Raw and Averaged Data Test Feature Identification 4.4 Summary of Results At the beginning of testing the value for the long threshold was set to 120 inches and the value for the short threshold was 60 inches. Review of data showed that while the short threshold was reasonable the long threshold was not. The long threshold was reduced to 78 inches (6' 6") and the short threshold was slightly reduced to 54 inches (4' 6"). The rule sets for the Four Way intersection and the Up T intersection were modified based on empirical data. The logic of the SideShortRule and SideLongRule were modified based on data obtained during tests _run to identify L intersections. These changes increased the likelihood of correctly identifying a feature. Misidentification of features was caused by two sources: the physical layout of an area such that the sonar sensors on the robot could not "see" the correct feature and sonar range readings which did not represent the distance to an obstacle correctly. 83

PAGE 97

Examples of the first case were found in identifying Up T intersections and Alcove and Dual Alcove Ends. At this point it is important to recall that the rules used to identify features dealt primarily with four specific sensors. These sensors pointed straight forward (sonar 0), straight to the left (sonar 4), straight back (sonar 8), and straight right (sonar 12). Thus, the misidentifications in these cases were caused because the end of the Corridor was detected but the openings that represented a cross corridor or Alcove areas were not. Figure 4.24 illustrates this point. Figure 4.24: Robot Detecting a Corridor End Instead of an Up T Even if the feature identification method used data from all the sonar sensors on the robot incorrect identifications of this sort could still occur. As the robot approaches a feature how could it tell an Up T intersection from a Dual Alcove End or an L intersection from an Alcove End? Figures 4 25 and 4.26 depict these situations. 84

PAGE 98

Figure 4.25: Robot Unable to Differentiate an Up T from a Dual Alcove Figure 4.26: Robot Unable to Differentiate an Alcove End from a L Intersection Misidentification of features also results from incorrect or misleading range readings. A corollary cause is expecting certain range readings but not seeing them. Essentially sonar sensors can only report the ranges they "see" and the rules for each of the features can only operate on the data presented by the sensors. As was noted above the original range thresholds were set such that objects at greater than 120 inches would be at a long range and that features closer than 60 inches 85

PAGE 99

would be considered to be located at short range. In initial tests a Four Way intersection was often identified as a Dual Alcove or an Up T. This was due to shorter than expected range readings to the front or sides. Changing these thresholds greatly reduced the occurrence of this problem. Figures 4.27 and 4.28 illustrate these situations. Figure 4.27: Misidentification of a Four Way as a Dual Alcove Figure 4.28: Misidentification of a Four Way as an Up T Incorrect feature identifications still occur due to incorrect range readings as evidenced by the phantom Corridor Ends seen in the multiple feature identification 86

PAGE 100

experiments. The sonar readings definitely indicated obstacles at ranges that in actuality were open spaces. This effect was most likely caused by having greater than normal reflective surfaces and/or multi-bounce effects. Most incorrect identifications seem to be confined to the areas where two features meet. Thus moving from a Corridor into a Four Way intersection can cause the transient identification of an Across Tor Alcove feature. These occurrences were reduced by the use of averaged data to compare to the various feature's rule sets. In summary, all features in the feature set were correctly identified by the methodology presented in this research. There were problems with misidentifying features at transitions between features but these were transitory and the use of averaged sonar data reduced their occurrence. Some misidentifications that occurred wer.e due to incorrect range readings. It is important to note that the identification of a current feature did not depend on the identification of a previous or future feature. That is, there was not, and would not be, a string of incorrectly identified features created because one in a series was improperly identified. Identification of a feature was, and is, dependent on the instance, or instances, of sonar data collected and the rule set for the feature that matched this data the best. 87

PAGE 101

5. Conclusions 5.1 Evaluation The purpose of this thesis was to develop a simple approach for the identification of "large-scale" features in an indoor environment using ultrasonic sensors. This goal has been met with some success. All the features in the feature set were correctly identified and distinguished from each other. Identification of features was achieved not only during tests to detect and identify one type of feature but also in tests that had the robot traverse areas of a building that had multiple features in them. The correct identification of a feature was not dependent on correctly identifying features before or after the current one. Misidentification of features was minimized by relatively minor changes to the logic and threshold settings of the program. Incorrect identifications were primarily restricted to areas where two features met and were transitory in nature As noted above, all features were correctly identified but this required that the robot was fully into the feature. More work needs to be done in this area to reduce misidentifications. This approach made use of an "off-the-shelf' robot with simple sensors. The robot and its sensors were sufficient to correctly identify the features of an indoor environment. This satisfied the overall philosophy of minimizing the use of specialized, non-standard, and potentially expensive equipment (e g., stereo vision systems, laser range finders, radar etc.) 88

PAGE 102

Another area where complexity was minimized was in the software A set of simple rules were combined in various ways to allow identification of the different features. The sonar data collected by the robot was matched against these rules with little or no filtering. The only filtering performed involved averaging six data sets together and using this averaged data set to identify features. This averaging did ha've the effect of reducing some misidentifications. This approach provided the means to identify large-scale features such as corridors and L intersections 5.2 Future Work As in many areas, obtaining answers to questions leads to the generation of more questions. The following areas could be explored as extensions of this study One area for future work identified above is to reduce, and hopefully eliminate, feature misidentification. One approach would disallow the identification of a new feature if the currently identified feature and a possible new feature have equivalent rule matching percentages. Another approach would be to provide a sliding set of sonar data for feature identification instead of the current method of using a fixed number of readings to apply to the rule sets. This larger set of data could be used together or segmented for feature identification. Both methods would allow a feature transition to be verified or proven false by waiting until additional data could be collected. A different way of identifying features could be explored. The concept of comparing what sonar data for a particular type of feature should look like to actual data could be researched via the use of neural nets. A neural net or nets could be trained to recognize the various features and used in place of the current set of rules. 89

PAGE 103

Another area to explore would be to modify the feature identification algorithm to be adaptive. One way would be to make the algorithm rotationally invariant. This would mean that the algorithm could correctly identify the current feature regardless of the orientation of the robot. This could be accomplished by normalizing the sonar data or by performing some form of pattern recognition on the data. Creating a more adaptive algorithm could also be accomplished by modifying the rules such that they would be adaptive to new surroundings, i.e., the rules can correctly identify features in different buildings that have different proportions. As an example, the hallways in one building may be eight feet wide while in another they are twelve feet These features are both corridors but the thresholds that would work in the first building would cause features to be misidentified in the second. Having the rules adjust their thresholds as changes in the environment are detected would be an approach to test. Another means to explore would be to develop rules that are geometrical or shape-based in nature. These rules would be similar to the current CorrRule that makes use of the detected length and width of the feature the robot is traversing. To better use the ability to identify features the software's control of the robot's navigation and maneuvering capabilities will need to be expanded. This would allow the robot to successfully navigate and identify features of an entire floor of a building. Once this capability was added the robot could build a map of its environment based on the features it identifies. This map could be used to identify changes in the environment, such as a door that was closed but is now open, and possibly identify areas where movement has been detected. 90


A very interesting capability to provide would be the capacity for the robot to accept directions to a location and then successfully navigate to that spot. This could be demonstrated in two modes: with and without a map. In both cases the robot would be provided with directions to the desired location, such as where the nearest copier is located. In the case where the robot does not currently have a map, the robot could demonstrate its ability to identify features and arrive at the location indicated by the directions. Using a map, a person could indicate the location to go to on the map itself. The robot would then use current sensor readings and the stored map information to localize itself and navigate to the goal.

5.3 Summary

The ability of a robot to perform useful work is predicated on having a map of its surroundings and being able to determine its location in that environment. How to develop a map for a mobile robot has been the focus of much work and has led to two main methods: grid-based and feature-based. This thesis, while following many other studies in the use of ultrasonic sensors, differs from previous work by using this data to identify the large-scale features of the environment, such as corridors and intersections. The approach presented here is computationally simple, uses off-the-shelf hardware, and uses the data obtained via the sonar sensors with a minimum of filtering and manipulation. While there are issues with correctly handling the transition from one feature to another and with incorrect range readings, this approach successfully identifies the features found in an indoor environment in a new way. Finally, future work for improving this method and adding new capabilities was described, which highlights the amount of additional research that could be performed in this area.
Appendix A: Computer Program

The C++ software program that performs large-scale feature identification is presented in this appendix. The classes are presented in alphabetical order, with the header (.h) file first, followed by the implementation (.cc) file.

#ifndef AUTO_PTR_H
#define AUTO_PTR_H
//////////////////////////////////////////////////////////////////////
// Name:        auto_ptr.h
// Description: Class which manages memory for an object created on the
//              heap. Implementation is from the Appendix of Scott
//              Meyers' More Effective C++. Meyers credits a 3/30/96
//              posting to comp.std.c++ by Greg Colvin for this
//              implementation.
// Created:     10/15/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

template<class T>
class auto_ptr
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   explicit auto_ptr(T* p = 0) : owner(p), px(p) {}

   // Copy constructor
   auto_ptr(const auto_ptr& ap) : owner(ap.owner), px(ap.release()) {}
   // Destructor
   ~auto_ptr() { if (owner) delete px; }

   // op*
   T& operator*() const { return *px; }

   // op->
   T* operator->() const { return px; }

   // Get pointer to managed object
   T* get() const { return px; }

   // Give up ownership of managed object
   T* release() const { owner = 0; return px; }

   // Manage a new object.
   // This function was removed from the latest version of auto_ptr
   // because it is redundant with op=
   void reset(T* p) { owner = p; px = p; }

   // op=
   auto_ptr& operator=(const auto_ptr& ap)
   {
      if ((void*)&ap != (void*)this)
      {
         if (owner)
         {
            delete px;
         }
         owner = ap.owner;
         px = ap.release();
      }
      return *this;
   }

////////////////////////////
// Protected Functions
////////////////////////////
protected:
////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   mutable bool owner;
   T *px;
};

#endif // AUTO_PTR_H

#ifndef BACKLONGRULE_H
#define BACKLONGRULE_H
//////////////////////////////////////////////////////////////////////
// Name:        BackLongRule.h
// Description: Rule that indicates there is a "long" distance in back
//              of the robot.
// Created:     12/1/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "RuleImpl.h"

////////////////
// Namespaces
////////////////
using namespace std;

// Base class: RuleImpl
class BackLongRule : public RuleImpl
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   BackLongRule(long theThreshold);

   // Destructor
   ~BackLongRule();

   // Determine if data matches
   virtual bool match(const vector<long>& theData) const;
////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   long itsThreshold;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        BackLongRule.cc
// Description: Implementation of rule to determine if there is a
//              "long" distance in back of the robot.
// Created:     12/1/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "BackLongRule.h"

////////////////
// Statics
////////////////
////////////////////////////
// Public Functions
////////////////////////////

BackLongRule::BackLongRule(long theThreshold)
{
   itsThreshold = theThreshold;
}

BackLongRule::~BackLongRule()
{
}

bool BackLongRule::match(const vector<long>& theData) const
{
   return (getBack(theData) > itsThreshold) ? true : false;
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef CONFIG_H
#define CONFIG_H
//////////////////////////////////////////////////////////////////////
// Name:        Config.h
// Description: Class for providing configuration info to robot (via a
//              flat file).
// Created:     12/27/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <string>

////////////////
// Forward Decls
////////////////
class ifstream;

////////////////
// Namespaces
////////////////
using namespace std;

class Config
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   Config(ifstream &in);

   // Destructor
   virtual ~Config();

   // Get a string from the input stream
   virtual string getString();
   // Get a float from the input stream
   virtual float getFloat();

   // Get an integer from the input stream
   virtual int getInt();

   // Get a long from the input stream
   virtual long getLong();

////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   ifstream& itsInput;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        Config.cc
// Description: Implementation for obtaining robot config data.
// Created:     12/27/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "Config.h"
#include <fstream>
#include <cstdlib>
#include "ConfigError.h"

////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

Config::Config(ifstream &in) : itsInput(in)
{
}

Config::~Config()
{
}

string Config::getString()
{
   bool haveLine = false;
   char data[200];

   // Get string from file
   while (!haveLine && itsInput.getline(data, 200))
   {
      // Check to see if line is commented out
      if (data[0] != '#')
      {
         haveLine = true;
      }
   } // while
   if (!haveLine)
   {
      throw ConfigError("End of File");
   }

   // Create string with data read in
   return string(data);
}

float Config::getFloat()
{
   return static_cast<float>( atof(getString().c_str()) );
}

int Config::getInt()
{
   return atoi(getString().c_str());
}

long Config::getLong()
{
   return atol(getString().c_str());
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef CONFIGERROR_H
#define CONFIGERROR_H
//////////////////////////////////////////////////////////////////////
// Name:        ConfigError.h
// Description: Derived exception class which indicates a problem
//              associated with reading configuration data.
// Created:     12/27/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <exception>
#include <string>

////////////////
// Namespaces
////////////////
using namespace std;

class ConfigError : public exception
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   ConfigError(const string &verbage);

   // Destructor
   virtual ~ConfigError();

   // Description of the exception
   virtual string& what() throw();
////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   string itsMessage;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        ConfigError.cc
// Description: Implementation for the ConfigError exception.
// Created:     12/27/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "ConfigError.h"

////////////////
// Statics
////////////////
////////////////////////////
// Public Functions
////////////////////////////

ConfigError::ConfigError(const string &verbage) : itsMessage(verbage)
{
}

ConfigError::~ConfigError()
{
}

string& ConfigError::what() throw()
{
   return itsMessage;
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef CORRRULE_H
#define CORRRULE_H
//////////////////////////////////////////////////////////////////////
// Name:        CorrRule.h
// Description: Rule to check if an area is longer by a set ratio (the
//              aspect ratio) than it is wide.
// Created:     11/28/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "RuleImpl.h"

////////////////
// Namespaces
////////////////
using namespace std;

// Base class: RuleImpl
class CorrRule : public RuleImpl
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   CorrRule(long theAspectRatio);

   // Destructor
   ~CorrRule();

   // Determine if data matches
   virtual bool match(const vector<long>& theData) const;
////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   long itsAspectRatio;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        CorrRule.cc
// Description: Implementation of a rule to see if a feature is longer
//              by a ratio than it is wide.
// Created:     11/28/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <vector>
#include "CorrRule.h"

////////////////
// Statics
////////////////
////////////////////////////
// Public Functions
////////////////////////////

CorrRule::CorrRule(long theAspectRatio)
{
   itsAspectRatio = theAspectRatio;
}

CorrRule::~CorrRule()
{
}

bool CorrRule::match(const vector<long>& theData) const
{
   // Get front & back readings and add
   long length = getFront(theData) + getBack(theData);

   // Get side readings and add
   long width = getLeft(theData) + getRight(theData);

   // Compare length to width multiplied by aspect ratio & return
   return ((width * itsAspectRatio) <= length) ? true : false;
}

#ifndef DATAFILTER_H
#define DATAFILTER_H
//////////////////////////////////////////////////////////////////////
// Name:        DataFilter.h
// Description: DataFilter will be an abstract base class for filters
//              that can be used by SonarImage to filter raw data.
//              Currently it is implemented as an initial test.
// Created:     11/1/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <vector>

////////////////
// Namespaces
////////////////
using namespace std;

class DataFilter
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   DataFilter();

   // Destructor
   ~DataFilter();

   long filter(const vector<long>& theRawData);
////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        DataFilter.cc
// Description: Implementation for the DataFilter class.
// Created:     11/1/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "DataFilter.h"
#include <numeric>   // For accumulate

////////////////
// Statics
////////////////
////////////////////////////
// Public Functions
////////////////////////////

DataFilter::DataFilter()
{
}

DataFilter::~DataFilter()
{
}

long DataFilter::filter(const vector<long>& theRawData)
{
   // Average the raw readings: sum them and divide by the count
   return (accumulate(theRawData.begin(), theRawData.end(), 0)
           / theRawData.size());
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////
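As an aside, the averaging performed by DataFilter::filter can be exercised in isolation. The fragment below is a hypothetical illustration only; the six range values are invented for the example and are not taken from the thesis experiments:

#include <vector>
#include <iostream>
#include "DataFilter.h"

int main()
{
   // Six range readings (tenths of inches) from one sonar sensor
   std::vector<long> readings;
   readings.push_back(480);
   readings.push_back(492);
   readings.push_back(475);
   readings.push_back(488);
   readings.push_back(500);
   readings.push_back(481);

   DataFilter filter;
   // The filter returns the integer mean of the readings
   std::cout << "Filtered range: " << filter.filter(readings) << std::endl;
   return 0;
}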

#ifndef ENVFEATURE_H
#define ENVFEATURE_H
//////////////////////////////////////////////////////////////////////
// Name:        EnvFeature.h
// Description: Base class (eventually) for environmental features
//              that the robot will detect and add to a map of the
//              environment.
// Created:     11/15/98
// Last Mod.:   12/29/98  Added list of SonarImages
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <string>   // For type
#include <list>     // Container for SonarImages

////////////////
// Forward Decls
////////////////
class ostream;
class SonarImage;

////////////////
// Namespaces
////////////////
using namespace std;

class EnvFeature
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructors
   EnvFeature();
   EnvFeature(const string &theType);
   EnvFeature(const EnvFeature &orig);

   // Destructor
   ~EnvFeature();

   // Add SonarImage to feature
   void add(const SonarImage* image);

   // Get SonarImages
   const list<const SonarImage*>& getImages() const;

   // Accessor for feature type
   const string& getType() const;

   // Set feature type
   void setType(const string& type);

   // Summary of feature type, start, end
   char* summary() const;

   // Convert feature to string
   char* toString() const;

   // op=
   EnvFeature& operator=(const EnvFeature &orig);

   // op==
   bool operator==(const EnvFeature &other);

   // op<<
   friend ostream& operator<<(ostream &os, const EnvFeature &feature);

////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:
////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   // Type of feature
   string itsType;

   // SonarImage container
   list<const SonarImage*> itsImages;
};

inline const list<const SonarImage*>& EnvFeature::getImages() const
{
   return itsImages;
}

inline const string& EnvFeature::getType() const
{
   return itsType;
}

inline void EnvFeature::setType(const string &type)
{
   itsType = type;
}

#endif

//////////////////////////////////////////////////////////////////////
// Name:        EnvFeature.cc
// Description: Implementation file for EnvFeature class.
// Created:     11/15/98
// Last Mod.:   12/29/98  Added add, setType, dtor, op=, op<<
//              1/5/99    Added summary & toString
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "EnvFeature.h"
#include <iostream>
#include <strstream>
#include "SonarImage.h"

////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

EnvFeature::EnvFeature()
{
}

EnvFeature::EnvFeature(const string &theType) : itsType(theType)
{
}

EnvFeature::EnvFeature(const EnvFeature &orig)
   : itsType(orig.itsType), itsImages(orig.itsImages)
{
}

EnvFeature::~EnvFeature()
{
}
void EnvFeature::add(const SonarImage* image)
{
   itsImages.push_back(image);
}

char* EnvFeature::summary() const
{
   ostrstream oss;

   oss << "Feature Type: " << itsType << endl;

   if (!itsImages.empty())
   {
      const SonarImage* start = itsImages.front();
      const SonarImage* end   = itsImages.back();
      oss << "Start: " << start->getImage().getXPosit();
      oss << " End: " << end->getImage().getXPosit();
      oss << endl;
   }

   oss << '\0';
   return oss.str();
}

char* EnvFeature::toString() const
{
   ostrstream oss;

   oss << "Feature Type: " << itsType << endl;

   if (!itsImages.empty())
   {
      const SonarImage* start = itsImages.front();
      const SonarImage* end   = itsImages.back();
      oss << "Start: " << start->getImage().getXPosit();
      oss << " End: " << end->getImage().getXPosit();
      oss << endl;
   }

   oss << "SonarImages:" << endl;
   list<const SonarImage*>::const_iterator iter;
   for (iter = itsImages.begin(); iter != itsImages.end(); ++iter)
   {
      oss << **iter << endl;
   }

   oss << '\0';
   return oss.str();
}

EnvFeature& EnvFeature::operator=(const EnvFeature& orig)
{
   if (this != &orig)
   {
      itsType = orig.itsType;
      itsImages = orig.itsImages;
   }
   return *this;
}

bool EnvFeature::operator==(const EnvFeature& other)
{
   return (itsType == other.itsType) ? true : false;
}

ostream& operator<<(ostream &os, const EnvFeature &feature)
{
   os << "Feature Type: " << feature.getType() << endl;

   if (!feature.getImages().empty())
   {
      const SonarImage* start = feature.getImages().front();
      const SonarImage* end   = feature.getImages().back();
      os << "Start: " << start->getImage().getXPosit();
      os << " End: " << end->getImage().getXPosit();
      os << endl;
   }

   os << "SonarImages:" << endl;
   list<const SonarImage*> silist = feature.getImages();
   list<const SonarImage*>::const_iterator iter;
   for (iter = silist.begin(); iter != silist.end(); ++iter)
   {
      os << **iter << endl;
   }

   return os;
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef FEATUREPATTERN_H
#define FEATUREPATTERN_H
//////////////////////////////////////////////////////////////////////
// Name:        FeaturePattern.h
// Description: Header file for FeaturePattern class. This class is an
//              Abstract Base Class for derived FeaturePatterns. A
//              FeaturePattern is how an indoor environmental feature
//              "looks" to the robot's sonars. It is used to identify
//              the type of EnvFeature to add to the robot's map.
// Created:     10/20/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <list>
#include <vector>

////////////////
// Forward Decls
////////////////
class EnvFeature;
class Rule;

class FeaturePattern
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructors
   FeaturePattern();
   FeaturePattern(const EnvFeature &theFeature,
                  const list<Rule*> &theRules);

   // Destructor
   virtual ~FeaturePattern();

   // Get Feature that Pattern matches
   virtual const EnvFeature& getFeature() const = 0;

   // Get the number of rules used by the pattern
   virtual long getNumOfRules() const = 0;

   // Get percent matching of pattern to data
   virtual float getPercentMatch() const = 0;

   // Determine if data matches this Pattern
   virtual bool match(const vector<long> &theData) = 0;

////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:
   // Copy ctor - prevent copies from being made
   FeaturePattern(const FeaturePattern &orig);

   // Operator = - prevent assignments
   FeaturePattern& operator=(const FeaturePattern &orig);

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        FeaturePattern.cc
// Description: Implementation file for FeaturePattern abstract class.
// Created:     10/20/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "FeaturePattern.h"

////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

FeaturePattern::FeaturePattern()
{
}

FeaturePattern::FeaturePattern(const EnvFeature &theFeature,
                               const list<Rule*> &theRules)
{
}

FeaturePattern::~FeaturePattern()
{
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef FEATUREPATTERNIMPL_H
#define FEATUREPATTERNIMPL_H
//////////////////////////////////////////////////////////////////////
// Name:        FeaturePatternImpl.h
// Description: Header file for FeaturePatternImpl class. This class
//              is an implementation class for FeaturePattern. It
//              provides methods and variables for use by derived
//              classes.
// Created:     11/8/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "FeaturePattern.h"   // Base class
#include <list>               // Container for Rules
#include "memory"             // For auto_ptr
#include "EnvFeature.h"

////////////////
// Forward Decls
////////////////
class Rule;

class FeaturePatternImpl : public FeaturePattern
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructors
   FeaturePatternImpl();
   FeaturePatternImpl(EnvFeature *theFeature, list<Rule*> *theRules);
   FeaturePatternImpl(const FeaturePatternImpl &orig);

   // Destructor
   ~FeaturePatternImpl();

   // Get Feature that Pattern matches
   virtual const EnvFeature& getFeature() const;

   // Get number of rules used by pattern
   virtual long getNumOfRules() const;

   // Get percent matching of pattern to data
   virtual float getPercentMatch() const;

   // Determine if data matches this Pattern
   virtual bool match(const vector<long> &theData);

   // Operator =
   FeaturePatternImpl& operator=(const FeaturePatternImpl &orig);

////////////////////////////
// Protected Functions
////////////////////////////
protected:
   const list<Rule*>& getRules() const;
   void setPercentMatch(float theMatch);

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   float itsPercentMatch;
   auto_ptr<EnvFeature> itsFeature;
   auto_ptr<list<Rule*> > itsRules;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        FeaturePatternImpl.cc
// Description: Implementation for FeaturePatternImpl.
// Created:     11/8/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <iostream>
#include <vector>
#include "FeaturePatternImpl.h"
#include "Rule.h"
#include <list>

////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

FeaturePatternImpl::FeaturePatternImpl()
   : itsFeature(new EnvFeature()), itsRules(new list<Rule*>)
{
   itsPercentMatch = 0;
}

FeaturePatternImpl::FeaturePatternImpl(EnvFeature *theFeature,
                                       list<Rule*> *theRules)
{
   itsFeature.reset(theFeature);
   itsRules.reset(theRules);
   itsPercentMatch = 0;
}

FeaturePatternImpl::FeaturePatternImpl(const FeaturePatternImpl &orig)
{
   itsFeature = orig.itsFeature;
   itsRules = orig.itsRules;
   itsPercentMatch = orig.itsPercentMatch;
}

FeaturePatternImpl::~FeaturePatternImpl()
{
   cout << "FPI dtor" << endl;
}

const EnvFeature& FeaturePatternImpl::getFeature() const
{
   return *itsFeature;
}

long FeaturePatternImpl::getNumOfRules() const
{
   return (*itsRules).size();
}

float FeaturePatternImpl::getPercentMatch() const
{
   return itsPercentMatch;
}

bool FeaturePatternImpl::match(const vector<long> &theData)
{
   list<Rule*>::const_iterator iter;
   long matches = 0;

   // Check to see if data matches any rules
   for (iter = (*itsRules).begin(); iter != (*itsRules).end(); ++iter)
   {
      Rule *rule = *iter;
      if (rule->match(theData))
      {
         ++matches;
      }
   }
   // Calculate the percentage of rules matched and return true/false
   itsPercentMatch = ((float)matches / (*itsRules).size()) * 100;

   return (matches > 0) ? true : false;
}

FeaturePatternImpl&
FeaturePatternImpl::operator=(const FeaturePatternImpl &orig)
{
   if (this != &orig)
   {
      itsPercentMatch = orig.itsPercentMatch;
      itsFeature = orig.itsFeature;
      itsRules = orig.itsRules;
   }
   return *this;
}

////////////////////////////
// Protected Functions
////////////////////////////

const list<Rule*>& FeaturePatternImpl::getRules() const
{
   return *itsRules;
}

void FeaturePatternImpl::setPercentMatch(float theMatch)
{
   itsPercentMatch = theMatch;
}

////////////////////////////
// Private Functions
////////////////////////////
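As a brief note on how these classes compose, the fragment below is a hypothetical illustration, modeled on the pattern construction performed in Robot::initPatterns; the threshold values, the aspect ratio, and the "corridor" label are invented for the example:

#include <list>
#include "FeaturePatternImpl.h"
#include "EnvFeature.h"
#include "FrontLongRule.h"
#include "BackLongRule.h"
#include "CorrRule.h"

// Build a corridor pattern: long ranges fore and aft, plus a
// length-to-width aspect ratio check (values illustrative only).
FeaturePatternImpl* makeCorridorPattern()
{
   list<Rule*>* rules = new list<Rule*>;
   rules->push_back(new FrontLongRule(1200));  // > 120 in. ahead
   rules->push_back(new BackLongRule(1200));   // > 120 in. behind
   rules->push_back(new CorrRule(3));          // length >= 3 * width

   // The pattern takes ownership of the feature and rule list
   return new FeaturePatternImpl(new EnvFeature("corridor"), rules);
}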

#ifndef FILELOG_H
#define FILELOG_H
//////////////////////////////////////////////////////////////////////
// Name:        FileLog.h
// Description: Derived log class which writes information to a flat
//              file.
// Created:     12/14/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "Log.h"
#include <fstream>
#include <string>

////////////////
// Forward Decls
////////////////
class ostream;
class SonarImage;

////////////////
// Namespaces
////////////////
using namespace std;

class FileLog : public Log
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructors
   FileLog(const char *filename);
   FileLog(const string &filename);

   // Destructor
   ~FileLog();

////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:
   // Write information to file
   void log(const string &aString);

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   ofstream logfile;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        FileLog.cc
// Description: Implementation of a Log which writes data to a flat
//              file.
// Created:     12/14/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "FileLog.h"
#include "Timestamp.h"   // Access to time functions
#include <iostream>
////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

FileLog::FileLog(const char *filename)
{
   logfile.open(filename);

   if (!logfile)
   {
      cerr << "Unable to open log file " << filename << endl;
   }
   else
   {
      logfile << "Start log: " << Timestamp::timestamp() << endl;
      logfile << endl;
   }
}

FileLog::FileLog(const string& filename)
{
   logfile.open(filename.c_str());

   if (!logfile)
   {
      cerr << "Unable to open log file " << filename << endl;
   }
   else
   {
      logfile << "Start log: " << Timestamp::timestamp() << endl;
      logfile << endl;
   }
}

FileLog::~FileLog()
{
   logfile << endl;
   logfile << "End log: " << Timestamp::timestamp();
}
////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

void FileLog::log(const string &aStr)
{
   logfile << aStr << endl;
}

#ifndef FRONTLONGRULE_H
#define FRONTLONGRULE_H
//////////////////////////////////////////////////////////////////////
// Name:        FrontLongRule.h
// Description: Rule that indicates there is a "long" distance in
//              front of the robot.
// Created:     12/1/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "RuleImpl.h"

// Base class: RuleImpl
class FrontLongRule : public RuleImpl
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   FrontLongRule(long theThreshold);

   // Destructor
   ~FrontLongRule();

   // Determine if data matches
   virtual bool match(const vector<long>& theData) const;

////////////////////////////
// Protected Functions
////////////////////////////
protected:
////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   long itsThreshold;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        FrontLongRule.cc
// Description: Implementation of rule to determine if there is a
//              "long" distance in front of the robot.
// Created:     12/1/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "FrontLongRule.h"

////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

FrontLongRule::FrontLongRule(long theThreshold)
{
   itsThreshold = theThreshold;
}

FrontLongRule::~FrontLongRule()
{
}

bool FrontLongRule::match(const vector<long>& theData) const
{
   return (getFront(theData) > itsThreshold) ? true : false;
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef FRONTSHORTRULE_H
#define FRONTSHORTRULE_H
//////////////////////////////////////////////////////////////////////
// Name:        FrontShortRule.h
// Description: Rule that indicates there is a "short" distance in
//              front of the robot.
// Created:     12/17/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "RuleImpl.h"

// Base class: RuleImpl
class FrontShortRule : public RuleImpl
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   FrontShortRule(long theThreshold);

   // Destructor
   ~FrontShortRule();

   // Determine if data matches
   virtual bool match(const vector<long>& theData) const;

////////////////////////////
// Protected Functions
////////////////////////////
protected:
////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   long itsThreshold;
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        FrontShortRule.cc
// Description: Implementation of rule to determine if there is a
//              "short" distance in front of the robot.
// Created:     12/17/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "FrontShortRule.h"

////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

FrontShortRule::FrontShortRule(long theThreshold)
{
   itsThreshold = theThreshold;
}

FrontShortRule::~FrontShortRule()
{
}

bool FrontShortRule::match(const vector<long>& theData) const
{
   return (getFront(theData) <= itsThreshold) ? true : false;
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef LOG_H
#define LOG_H
//////////////////////////////////////////////////////////////////////
// Name:        Log.h
// Description: Base class for Log hierarchy. Provides the interface
//              for derived log classes that are used by the robot.
// Created:     12/13/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <string>
#include "Observer.h"

////////////////
// Namespaces
////////////////
using namespace std;

class Log : public Observer
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   Log();
   virtual ~Log();

   virtual void update(const string &aString);
////////////////////////////
// Protected Functions
////////////////////////////
protected:
   virtual void log(const string &aString) = 0;

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        Log.cc
// Description: Implementation for Log class.
// Created:     12/13/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "Log.h"

////////////////
// Statics
////////////////
////////////////////////////
// Public Functions
////////////////////////////

Log::Log()
{
}

Log::~Log()
{
}

void Log::update(const string &aString)
{
   log(aString);
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef LOGGER_H
#define LOGGER_H
//////////////////////////////////////////////////////////////////////
// Name:        Logger.h
// Description: Class which provides logging facilities for robot.
// Created:     12/10/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <string>
#include "Subject.h"

////////////////
// Namespaces
////////////////
using namespace std;

class Logger : public Subject
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   Logger();

   // Destructor
   ~Logger();

   // Log message
   void log(const string &message);

   // Log message with timestamp
   void logt(const string &message);
////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:
   // Do not allow copying
   Logger(const Logger &logger);
   Logger &operator=(const Logger &logger);

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        Logger.cc
// Description: Implementation for logging facility.
// Created:     12/10/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "Timestamp.h"
#include "Logger.h"   // Header file
////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////

Logger::Logger()
{
}

Logger::~Logger()
{
}

void Logger::log(const string &message)
{
   notify(message);
}

void Logger::logt(const string &message)
{
   string tMsg(Timestamp::timestamp());
   tMsg += ": ";
   tMsg += message;
   notify(tMsg);
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef MEMORY
#define MEMORY
//////////////////////////////////////////////////////////////////////
// Name:        memory
// Description: Substitute for Standard C++ Library memory file.
//              #include's auto_ptr.h
// Created:     10/15/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "auto_ptr.h"

#endif

#ifndef OBSERVER_H
#define OBSERVER_H
//////////////////////////////////////////////////////////////////////
// Name:        Observer.h
// Description: Class which monitors another class (the Subject).
//              Based on the Observer pattern in Design Patterns.
// Created:     10/12/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <string>

////////////////
// Forward Decls
////////////////
class Subject;

////////////////
// Namespaces
////////////////
using namespace std;

class Observer
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   Observer();
   virtual ~Observer();

   virtual void update(const string &aString) = 0;
////////////////////////////
// Protected Functions
////////////////////////////
protected:

////////////////////////////
// Private Functions
////////////////////////////
private:

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
};

#endif

//////////////////////////////////////////////////////////////////////
// Name:        Observer.cc
// Description: Implementation for Observer class.
// Created:     10/13/98
// Last Mod.:
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "Observer.h"

////////////////
// Statics
////////////////

////////////////////////////
// Public Functions
////////////////////////////
Observer::Observer()
{
}

Observer::~Observer()
{
}

////////////////////////////
// Protected Functions
////////////////////////////

////////////////////////////
// Private Functions
////////////////////////////

#ifndef ROBOT_H
#define ROBOT_H
//////////////////////////////////////////////////////////////////////
// Name:        Robot.h
// Description: This class provides the interface for the Robot class.
//              This class was originally part of a class project for
//              CSC 5804 Robotics at the University of Colorado,
//              Denver, Spring 1998. This class has been modified as
//              part of my Master's thesis at the University of
//              Colorado, Denver, Fall 1998 - Spring 1999. This class
//              is responsible for controlling the movement of the
//              robot and the collection and processing of sonar data
//              to identify indoor features.
// Created:     3/7/98
// Last Mod.:   11/19/98  Added logger and logs
//                        Modified explore to collect sonar data and
//                        identify features
//////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include <list>        // For holding EnvFeatures
#include "memory"      // For auto_ptr
#include <vector>      // For holding FeaturePatterns and SonarData
#include <string>
#include "Nclient.h"   // Header for robot interface library
#include "Config.h"
#include "DataFilter.h"
#include "EnvFeature.h"
#include "FeaturePatternImpl.h"
#include "Logger.h"
#include "SonarImage.h"

////////////////
// Forward Decls
////////////////

////////////////
// Namespaces
////////////////
using namespace std;
class Robot
{
////////////////////////////
// Public Functions
////////////////////////////
public:
   // Constructor
   Robot(const char* filename);

   // Destructor
   ~Robot();

   // Determine direction to move to avoid an obstacle
   int avoid();

   // Check for obstacles
   void checkForObstacles();

   // Explore
   void explore();

   // Initialize
   bool init();

   // Check to see if there is an obstacle
   bool obstacle() const { return myObstacle; }

   // Log collected data
   void report() const;

   // Command which wraps/corrects robot's vm command
   void scoutVm(short theTrans, short theRot);

   // Shutdown
   void shutdown();

////////////////////////////
// Protected Functions
////////////////////////////
protected:
////////////////////////////
// Private Functions
////////////////////////////
private:
   // Make sure we don't hit a wall
   void checkRange(long range);

   // Make sure we don't hit a wall - version 2
   void checkRange(long *state);

   // Get variable parameters
   void getConfig();

   // Get X position of Robot
   long getXPosit(int sensorId);

   // Get Y position of Robot
   long getYPosit(int sensorId);

   // Create SonarImages for self-test
   void initImages();

   // Create FeaturePatterns to be used
   void initPatterns();

   // Initialize the sensor suite
   bool initSensors();

   // Convert measurements in tenths of inches to inches
   float tenthsToInches(long tenths) const { return (tenths / 10.0); }

////////////////////////////
// Protected Members
////////////////////////////
protected:

////////////////////////////
// Private Members
////////////////////////////
private:
   unsigned int      myTransSpeed;          // Translational speed (.1 in/sec)
   int               myDistance;            // Travel dist (in .1 inches)
   int               mySonarFireRate;       // Firing rate for sonars
   static const int  myTimeout = 0;         // Robot has no timeout
   static const int  myTurnAngle = 450;     // Angle to turn (in 0.1 deg)
   int               myCorrAspectRatio;
   int               myReadingsPerImage;
   int               myFWCornerMatches;
   int               myReadingDistance;     // 0.1 inches to take a reading at
   static const long myId = 1;              // ID of Robot to server
   long              myMinRangeToObs;
   long              myShortThreshold;
   long              myLongThreshold;
   bool              myObstacle;
   bool              avoiding;              // new
   int               myDirection;
   int               myRet;
   long              robotPosit;
   int               mySonarFiringOrder[16];
   string            myObstacleSensed;
   PosData           myPosData;
   DataFilter        myDataFilter;
   EnvFeature*       myCurrentFeature;
   auto_ptr<Config> myConfig;
   auto_ptr<Logger> myScreenLogger;
   auto_ptr<Logger> myFeatureLogger;
   auto_ptr<Logger> mySonarLogger;
   auto_ptr<Logger> myDebugLogger;

   vector<FeaturePatternImpl*> myPatterns;
   list<EnvFeature*>           myFeatures;
   vector<SonarImage*>         mySonarImages;   // For self test
};

#endif // ROBOT_H

//////////////////////////////////////////////////////////////////////
// Name:        Robot.cc
// Description: Implementation for Robot class. This class was
//              originally part of a class project for CSC 5804
//              Robotics at the University of Colorado, Denver, Spring
//              1998. It has been modified as part of my Master's
//              thesis at the University of Colorado, Denver,
//              Fall 1998 - Spring 1999. This class is responsible for
//              controlling the movement of the robot and the
//              collection and processing of sonar data to identify
//              indoor features.
// Created:     3/7/98
// Last Mod.:   3/26/98   Added functionality to avoid obstacles in
//                        the path of the robot
//              3/31/98   Added avoid, checkForObstacles, & modified
//                        explore
//              11/19/98  Added use of logger and logs
//                        Modified explore to collect sonar data and
//                        added functionality to ID large-scale indoor
//                        features
//////////////////////////////////////////////////////////////////////
#include "Robot.h" I /Header for Robot #include #include #include #include #include #include. #include # i nclude #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include //IIIII/I///// II Statics l//////ll//l/1 IIIII/ I Ill /IIIII/ II Ill Ill Ill I I Public Functions 1/11/111 II II Ill II /IIIII/I/// Robot::Robot(const char* filename): myObstacleSensed("") myCurrentF eature(O) { II Provide logging to screen 152

PAGE 166

   myScreenLogger.reset(new Logger);
   auto_ptr<Log> sl(new ScreenLog);
   myScreenLogger->attach(sl);

   // Create logger and file for debug data
   myDebugLogger.reset(new Logger);
   string debugFile(filename);
   debugFile += ".debug";
   auto_ptr<Log> debugLog(new FileLog(debugFile));
   myDebugLogger->attach(debugLog);

   // Create logger and file for sonar data
   mySonarLogger.reset(new Logger);
   string sonarFile(filename);
   sonarFile += ".sonar";
   auto_ptr<Log> sonarLog(new FileLog(sonarFile));
   mySonarLogger->attach(sonarLog);

   // Create logger and file for features
   myFeatureLogger.reset(new Logger);
   string featureFile(filename);
   featureFile += ".feature";
   auto_ptr<Log> featureLog(new FileLog(featureFile));
   myFeatureLogger->attach(featureLog);

   // Read configuration parameters
   getConfig();

   // Create feature patterns
   initPatterns();

   // Initialize obstacle avoidance variables
   avoiding = false;
   myDirection = 0;
   myObstacle = false;
   myRet = 0;
   robotPosit = 0;

   // For a "self-test" of the pattern matching code call initImages
#ifdef SELFTEST
   initImages();
#endif

   myScreenLogger->log("Robot Created");
}

// Destructor
Robot::~Robot()
{
   // Clean up - delete myCurrentFeature to prevent leak
   delete myCurrentFeature;

   // Log destruction of robot
   myScreenLogger->log("Robot destructed");
}

int Robot::avoid()
{
   // Direction to return
   int myDirection = -1;

   int obsInt = atoi(myObstacleSensed.c_str());

   // Determine which case is being dealt with
   switch (obsInt)
   {
      case 10000: case 11000: case 1000: case 11100: case 1100:
      case 1010: case 1110: case 10100: case 10110: case 11001:
      case 11010: case 11110:
         myDirection = 450;
         myDebugLogger->logt("Turning Right");
         break;

      case 1: case 11: case 10: case 111: case 101: case 110:
      case 1001: case 10010: case 10011: case 1101: case 1111:
         myDirection = 3150;
         myDebugLogger->logt("Turning Left");
         break;

      case 100:
         myDirection = 450;
         myDebugLogger->logt("Obstacle directly ahead. Turning Right.");
         break;

      case 0: case 10001:
         myDirection = 0;
         break;

      case 11111: case 10101: case 10111: case 11011: case 11101:
         myObstacle = true;
         break;

      default:
         myDebugLogger->logt("Unrecognized obstacle pattern");
         break;
   }

   return myDirection;
}

void Robot::checkForObstacles()
{
   myObstacleSensed = "";

   for (int i = 2; i >= 0; i--)
      if (State[STATE_SONAR_2 - i] < myMinRangeToObs)
         myObstacleSensed += "1";
      else
         myObstacleSensed += "0";

   for (int i = 1; i >= 0; i--)
      if (State[STATE_SONAR_15 - i] < myMinRangeToObs)
         myObstacleSensed += "1";
      else
         myObstacleSensed += "0";
}

// Explore
void Robot::explore()
{
   myDebugLogger->logt("Entering explore");

   // Collect initial sensor readings
   gs();
   checkForObstacles();

   // Keep track of how far robot has moved
   long distTravelled = 0;

   // Keep track of data collected to produce SonarImage
   int amountOfData = 0;

   // Container for raw sonar data
    vector<SonarData> dataToFilter(myReadingsPerImage);

    while((State[STATE_CONF_X] < myDistance) && (obstacle() == false))
    {
        // Collect sensor readings
        gs();

        long currX = State[STATE_CONF_X];
        if((currX - distTravelled) >= myReadingDistance)
        {
            long currY = State[STATE_CONF_Y];

            // get sonar data
            vector<long> rawData(16);
            for(int i = 0; i < 16; ++i)
            {
                rawData[i] = State[17 + i];
            }

            SonarData sonarData(tenthsToInches(currX),
                                tenthsToInches(currY),
                                rawData);

            // Record all SonarData collected
            mySonarLogger->logt("SonarData collected");
            mySonarLogger->log(sonarData.toString());

            dataToFilter[amountOfData] = sonarData;
            amountOfData++;

            // If enough data has been collected, filter and
            // create SonarImage
            if(amountOfData == myReadingsPerImage)
            {
                SonarImage *image = new SonarImage(myDataFilter, dataToFilter);

                // Record filtered data
                mySonarLogger->logt("SonarImage created");
                mySonarLogger->log(image->toString());
                // Reset data counter
                amountOfData = 0;

                // Keep track of percent match per FPI & which FPI has
                // highest match
                float bestPercentMatch = 0.0;
                FeaturePatternImpl *bestMatch = 0;

                //***** Added 1/9/99 *****
                vector<SonarImage::matchPair> mpvec;
                size_t numSortFeatures = 3;
                //***** Added 1/9/99 *****

                // pattern match
                vector<FeaturePatternImpl*>::iterator iter;
                for(iter = myPatterns.begin(); iter != myPatterns.end(); ++iter)
                {
                    FeaturePatternImpl *fpi = *iter;

                    // Check to see if FPI is the best match for data
                    if(fpi->match(image->getImage().getSonarPings()))
                    {
                        //***** Added 1/9/99 *****
                        mpvec.push_back(SonarImage::matchPair(
                            fpi->getPercentMatch(),
                            fpi->getFeature().getType()));
                        //***** Added 1/9/99 *****

                        if(fpi->getPercentMatch() > bestPercentMatch)
                        {
                            bestMatch = fpi;

                            // Set bestPercentMatch to that of FPI
                            bestPercentMatch = fpi->getPercentMatch();
                        }
                        else if(fpi->getPercentMatch() == bestPercentMatch)
                        {
                            // Prefer corr to 4W, 4W to Ts
                            if(bestMatch->getFeature().getType() != "corridor")
                            {
                                if(fpi->getFeature().getType() == "fourWay")
                                {
                                    bestMatch = fpi;
                                }
                                // Otherwise, stay with pattern found first
                            } // if feature != corridor
                        } // else if percent matches are same
                    } // if match
                } // myPatterns loop

                if(bestMatch)
                {
                    //***** Added 1/9/99 *****
                    sort(mpvec.begin(), mpvec.end(),
                         greater<SonarImage::matchPair>());
                    if(mpvec.size() > numSortFeatures)
                    {
                        mpvec.erase(mpvec.begin() + 3, mpvec.end());
                    }
                    image->setMatches(mpvec);
                    //***** Added 1/9/99 *****

                    // If first feature, set to current and add SonarImage
                    if(!myCurrentFeature)
                    {
                        myCurrentFeature =
                            new EnvFeature(bestMatch->getFeature());
                        myCurrentFeature->add(image);

                        // Log feature
                        myScreenLogger->logt("Initial Feature Detected");
                        myScreenLogger->log(myCurrentFeature->getType());
                        myFeatureLogger->logt(myCurrentFeature->summary());
                        mySonarLogger->logt(myCurrentFeature->toString());
                    }
                    // If still in current feature, add SonarImage
                    else if(bestMatch->getFeature().getType() ==
                            myCurrentFeature->getType())
                    {
                        myCurrentFeature->add(image);

                        myFeatureLogger->logt("");
                        myFeatureLogger->log(myCurrentFeature->summary());
                        mySonarLogger->logt("");
                        mySonarLogger->log(myCurrentFeature->toString());
                    }
                    // If new feature, save old and set current to new feature
                    // Stop robot to indicate new feature detected
                    else
                    {
                        // Command robot to stop
                        st();
                        sleep(5);

                        EnvFeature* old = myCurrentFeature;
                        myFeatures.push_back(old);

                        myCurrentFeature =
                            new EnvFeature(bestMatch->getFeature());
                        myCurrentFeature->add(image);

                        // Log feature
                        myScreenLogger->logt("New Feature Detected");
                        myScreenLogger->log(myCurrentFeature->getType());
                        myFeatureLogger->logt("");
                        myFeatureLogger->log(myCurrentFeature->summary());
                        mySonarLogger->logt("");
                        mySonarLogger->log(myCurrentFeature->toString());
                    }
                } // if bestmatch
                else
                {
                    mySonarLogger->logt("SonarImage did not match a Pattern");
                    mySonarLogger->log(image->toString());
                }

                // Prevent leak
                //delete bestMatch;

            } // if amountOfData == myReadingsPerImage

            // increment distTravelled
            distTravelled = currX;

        } // if delta X >= myReadingDistance
        checkForObstacles();
        int dir = avoid();

        switch(dir)
        {
            case -1:
                break;
            case 0:
                if(!avoiding)
                {
                    scoutVm(myTransSpeed, 0);
                    avoiding = false;
                }
                else
                {
                    myDebugLogger->logt("*** Should turn back to original dir ***");
                    //cout << "Direction to turn is: " << -myDirection << endl;
                    avoiding = false;
                    st();
                    ws(1,1,1,2);
                    scoutVm(0, myRet);
                    scoutVm(myTransSpeed, 0);
                    ws(1,1,1,2);
                }
                break;
            case 450:
                if(!avoiding)
                {
                    myRet = 3150;
                    avoiding = true;
                }
                st();
                ws(1,1,1,2);
                scoutVm(0, 450);
                scoutVm(myTransSpeed, 0);
                ws(1,1,1,2);
                break;
            case 3150:
                if(!avoiding)
                {
                    myRet = -myTurnAngle;
                    avoiding = true;
                }
                st();
                ws(1,1,1,2);
                scoutVm(0, 3150);
                scoutVm(myTransSpeed, 0);
                ws(1,1,1,2);
                break;
            default:
                myDebugLogger->logt("Unrecognized return from avoid()");
                break;
        } // switch dir
    } // while

    // tell robot to stop
    st();
    ws(1,1,1,2);

    // Add "last" current feature to feature list
    if(myCurrentFeature)
    {
        myFeatures.push_back(myCurrentFeature);
    }
}

// Initialize
bool Robot::init()
{
    myDebugLogger->logt("Initializing robot");

    int status = 0;

    // Connect to server
    status = connect_robot(myId);
    if(!status)
    {
        myDebugLogger->log("unable to connect to robot");
    }

    // Zero the Robot location
    int statuszr = zr();
    status += statuszr;
    if(!statuszr)
    {
        myDebugLogger->log("unable to zero robot location");
    }

    // Set command timeout
    int statustm = conf_tm(myTimeout);
    status += statustm;
    if(!statustm)
    {
        myDebugLogger->log("unable to set timeout");
    }

    // Set the motion parameters
    int statussp = sp(myTransSpeed, 0, 0);
    status += statussp;
    if(!statussp)
    {
        myDebugLogger->log("unable to set motion params");
    }

    // Initialize the sensors - ** Note: May not need to do this **
#ifdef SNSRINIT
    bool sensorStatus = initSensors();
    if(!sensorStatus)
    {
        myDebugLogger->log("unable to init sensors");
    }
#endif

#ifdef SNSRINIT
    return ((status + (int)sensorStatus) == 5) ? true : false;
#else
    return (status == 4) ? true : false;
#endif
}

// Log EnvFeatures detected
void Robot::report() const
{
    myScreenLogger->log(" ");
    myScreenLogger->logt(" ");
    myFeatureLogger->log("");
    myFeatureLogger->logt("");

    list<EnvFeature*>::const_iterator iter;
    for(iter = myFeatures.begin(); iter != myFeatures.end(); ++iter)
    {
        myScreenLogger->log((*iter)->summary());
        myFeatureLogger->log((*iter)->toString());
    }
}

// Command robot to move
void Robot::scoutVm(short theTrans, short theRot)
{
    int rot = (int)((float)theRot * 377.0 / 3600.0);
    vm(theTrans + rot, theTrans - rot, 0);
}

// Shutdown
void Robot::shutdown()
{
    myScreenLogger->logt("Shutting Down Robot");
    myDebugLogger->logt("Shutting Down Robot");

    // Disconnect from server
    disconnect_robot(myId);
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////

// Check ranges to avoid collisions
void Robot::checkRange(long range)
{
    cout << "myObstacle = " << myObstacle << endl;
    if(range < myMinRangeToObs)
    {
        myObstacle = true;
        myDebugLogger->logt("");
        myDebugLogger->log("******************************************");
        myDebugLogger->log("     !!!OBSTACLE DETECTED AHEAD!!!        ");
    }
}

// Check ranges to avoid collisions
void Robot::checkRange(long *s)
{
    long r = myMinRangeToObs;
    if(s[17] < r)   // (remaining comparisons illegible in the printed listing)
    {
        myObstacle = true;
        myDebugLogger->logt("");
        myDebugLogger->log("******************************************");
        myDebugLogger->log("     !!!OBSTACLE DETECTED AHEAD!!!        ");
    }
}

void Robot::getConfig()
{
    myScreenLogger->log("Reading in configuration data");
    myDebugLogger->logt("Reading in configuration data");

    // Open config file for reading
    ifstream ifs("robot.cfg");

    // Create Config object
    Config cfg(ifs);

    // Get information from file
    try
    {
        myTransSpeed = cfg.getInt();
        ostrstream mts;
        mts << "Trans Speed is: " << myTransSpeed << '\0';
        myDebugLogger->log(mts.str());
        myDistance = cfg.getInt();
        ostrstream md;
        md << "Travel Distance is: " << myDistance << '\0';
        myDebugLogger->log(md.str());

        mySonarFireRate = cfg.getInt();
        ostrstream msfr;
        msfr << "Sonar Firing Rate is: " << mySonarFireRate << '\0';
        myDebugLogger->log(msfr.str());

        myCorrAspectRatio = cfg.getInt();
        ostrstream mcar;
        mcar << "Corridor Aspect Ratio is: " << myCorrAspectRatio << '\0';
        myDebugLogger->log(mcar.str());

        myReadingsPerImage = cfg.getInt();
        ostrstream mrpi;
        mrpi << "Readings per SonarImage is: " << myReadingsPerImage << '\0';
        myDebugLogger->log(mrpi.str());

        myFWCornerMatches = cfg.getInt();
        ostrstream mfcm;
        mfcm << "Corners to Match are: " << myFWCornerMatches << '\0';
        myDebugLogger->log(mfcm.str());

        myMinRangeToObs = cfg.getLong();
        ostrstream mmrto;
        mmrto << "Min Range to Obstacles is: " << myMinRangeToObs << '\0';
        myDebugLogger->log(mmrto.str());

        myShortThreshold = cfg.getLong();
        ostrstream mst;
        mst << "Short Threshold is: " << myShortThreshold << '\0';
        myDebugLogger->log(mst.str());

        myLongThreshold = cfg.getLong();
        ostrstream mlt;
        mlt << "Long Threshold is: " << myLongThreshold << '\0';
        myDebugLogger->log(mlt.str());

        myReadingDistance = cfg.getInt();
        ostrstream mrd;
        mrd << "Reading Distance is: " << myReadingDistance << '\0';
        myDebugLogger->log(mrd.str());
    }
    catch(ConfigError &ce)
    {
        myDebugLogger->log("Exception caught: EOF of config file, exiting");
        myDebugLogger->log(ce.what());
        exit(1);
    }
    catch(...)
    {
        myDebugLogger->log("Unexpected exception caught: exiting");
        exit(1);
    }
}

long Robot::getXPosit(int sensorId)
{
    // convert to inches
    robotPosit = State[34]/10;

    if(sensorId == 0)
    {
        if(State[STATE_SONAR_0] <= 120)
        {
            robotPosit += State[STATE_SONAR_0];
        }
        else
            robotPosit += 99999;
    }

    cout << "robotPosit = " << robotPosit << endl;
    return robotPosit;
}

long Robot::getYPosit(int sensorId)
{
    // Get adjustment due to Robot drift
    long adjustment = State[35]/10;

    return (sensorId == 4) ? State[STATE_SONAR_4] - adjustment :
                             State[STATE_SONAR_12] + adjustment;
}
void Robot::initImages()
{
    myDebugLogger->logt("Initializing SonarImages for self test");

    // Create Set of SIs
    // Create arrays & vectors
    long corrArray[16] =
        {226,154,200,44,43,44,209,120,202,106,170,36,32,33,201,150};
    vector<long> v1(&corrArray[0], &corrArray[16]);

    long fwArray[16] =
        {234,211,200,131,169,132,66,190,222,170,66,186,136,113,223,200};
    vector<long> v2(&fwArray[0], &fwArray[16]);

    //long atArray[16] = {};
    //vector<long> v3(&fwArray[0], &fwArray[16]);

    long utArray[16] =
        {39,40,70,73,198,191,65,99,207,153,65,126,133,184,72,40};
    vector<long> v4(&utArray[0], &utArray[16]);

    long alcArray[16] =
        {206,124,180,44,43,44,209,120,202,102,53,76,75,76,180,120};
    vector<long> v5(&alcArray[0], &alcArray[16]);

    long dalcArray[16] =
        {206,124,180,71,70,80,57,98,197,102,53,76,75,76,180,120};
    vector<long> v6(&dalcArray[0], &dalcArray[16]);

    //long ceArray[16] = {};
    //vector<long> v7(&dalcArray[0], &dalcArray[16]);

    //long aeArray[16] = {};
    //vector<long> v8(&dalcArray[0], &dalcArray[16]);

    //long daeArray[16] = {};
    //vector<long> v9(&dalcArray[0], &dalcArray[16]);

    long lArray[16] =
        {66,71,70,42,41,42,60,91,95,96,51,89,186,203,212,150};
    vector<long> v10(&lArray[0], &lArray[16]);

    // Create SonarDatas and lists for SIs
    long x = 45;
    long y = 90;
    SonarData sd1(x, y, v1);
    SonarData sd2(x, y, v2);
    //SonarData sd3(x, y, v3);
    SonarData sd4(x, y, v4);
    SonarData sd5(x, y, v5);
    SonarData sd6(x, y, v6);
    //SonarData sd7(x, y, v7);
    //SonarData sd8(x, y, v8);
    //SonarData sd9(x, y, v9);
    SonarData sd10(x, y, v10);

    vector<SonarData> sdlist;

    // Create SIs
    sdlist.push_back(sd1);
    SonarImage *si1 = new SonarImage(myDataFilter, sdlist);
    sdlist.pop_back();

    sdlist.push_back(sd1);
    SonarImage *si1a = new SonarImage(myDataFilter, sdlist);
    sdlist.pop_back();

    sdlist.push_back(sd2);
    SonarImage *si2 = new SonarImage(myDataFilter, sdlist);
    sdlist.pop_back();

    sdlist.push_back(sd4);
    SonarImage *si4 = new SonarImage(myDataFilter, sdlist);
    sdlist.pop_back();

    sdlist.push_back(sd5);
    SonarImage *si5 = new SonarImage(myDataFilter, sdlist);
    sdlist.pop_back();

    sdlist.push_back(sd6);
    SonarImage *si6 = new SonarImage(myDataFilter, sdlist);
    sdlist.pop_back();

    sdlist.push_back(sd10);
    SonarImage *si10 = new SonarImage(myDataFilter, sdlist);
    sdlist.pop_back();

    // Insert SIs into list to test matching functions
    // Start with corridor then switch to other SIs

    // 3 corr
    myDebugLogger->log("Adding 3 corr to SI list");
    mySonarImages.push_back(si1);
    mySonarImages.push_back(si1a);
    mySonarImages.push_back(si1);

    // 2 fw
    myDebugLogger->log("Adding 2 four way to SI list");
    mySonarImages.push_back(si2);
    mySonarImages.push_back(si2);

    // 1 corr
    myDebugLogger->log("Adding 1 corr to SI list");
    mySonarImages.push_back(si1);

    // 1 UT
    myDebugLogger->log("Adding 1 UT to SI list");
    mySonarImages.push_back(si4);

    // 1 corr
    myDebugLogger->log("Adding 1 corr to SI list");
    mySonarImages.push_back(si1);

    // alcove
    myDebugLogger->log("Adding 1 alcove to SI list");
    mySonarImages.push_back(si5);

    // dual alcove
    myDebugLogger->log("Adding 1 dual alcove to SI list");
    mySonarImages.push_back(si6);

    // L
    myDebugLogger->log("Adding 1 L to SI list");
    mySonarImages.push_back(si10);

    // Print out SonarImages list
    vector<SonarImage*>::iterator iter;
    for(iter = mySonarImages.begin(); iter != mySonarImages.end(); ++iter)
    {
        myDebugLogger->log((*iter)->toString());
    }
}

void Robot::initPatterns()
{
    myDebugLogger->logt("Initializing patterns");

    // Create corridor pattern
    EnvFeature *corridor = new EnvFeature("corridor");
    list<Rule*> *corrRules = new list<Rule*>;
    corrRules->push_front(new FrontLongRule(myLongThreshold));
    corrRules->push_back(new BackLongRule(myLongThreshold));
    corrRules->push_back(new TwoSideShortRule(myShortThreshold));
    corrRules->push_back(new CorrRule(myCorrAspectRatio));
    myPatterns.push_back(new FeaturePatternImpl(corridor, corrRules));

    // Create four way pattern
    EnvFeature *fourWay = new EnvFeature("four way");
    list<Rule*> *fwRules = new list<Rule*>;
    fwRules->push_back(new BackLongRule(myLongThreshold));
    fwRules->push_back(new FrontLongRule(myLongThreshold));
    fwRules->push_back(new TwoSideLongRule(myLongThreshold));
    //fwRules->push_back(new FWCornerRule(myFWCornerMatches));
    myPatterns.push_back(new FeaturePatternImpl(fourWay, fwRules));

    // Create across T pattern
    EnvFeature *acrossT = new EnvFeature("across T");
    list<Rule*> *acrossTRules = new list<Rule*>;
    acrossTRules->push_back(new BackLongRule(myLongThreshold));
    acrossTRules->push_back(new FrontLongRule(myLongThreshold));
    acrossTRules->push_back(
        new SideLongRule(myShortThreshold, myLongThreshold));
    myPatterns.push_back(new FeaturePatternImpl(acrossT, acrossTRules));

    // Create up T pattern
    EnvFeature *upT = new EnvFeature("up T");
    list<Rule*> *upTRules = new list<Rule*>;
    upTRules->push_back(new BackLongRule(myLongThreshold));
    //upTRules->push_back(new FrontArcRule(myLongThreshold));
    upTRules->push_back(new TwoSideLongRule(myLongThreshold));
    upTRules->push_back(new FrontShortRule(myShortThreshold));
    myPatterns.push_back(new FeaturePatternImpl(upT, upTRules));

    // Create corridor alcove pattern
    EnvFeature *alcove = new EnvFeature("alcove");
    list<Rule*> *alcRules = new list<Rule*>;
    alcRules->push_back(new FrontLongRule(myLongThreshold));
    alcRules->push_back(new BackLongRule(myLongThreshold));
    alcRules->push_back(
        new SideInterRule(myShortThreshold, myLongThreshold));
    alcRules->push_back(new SideShortRule(myShortThreshold));
    myPatterns.push_back(new FeaturePatternImpl(alcove, alcRules));

    // Create corridor dual alcove pattern
    EnvFeature *dualAlcove = new EnvFeature("dual alcove");
    list<Rule*> *dAlcRules = new list<Rule*>;
    dAlcRules->push_back(new FrontLongRule(myLongThreshold));
    dAlcRules->push_back(new BackLongRule(myLongThreshold));
    dAlcRules->push_back(
        new TwoSideInterRule(myShortThreshold, myLongThreshold));
    myPatterns.push_back(new FeaturePatternImpl(dualAlcove, dAlcRules));

    // Create corridor end pattern
    EnvFeature *corrEnd = new EnvFeature("corridor end");
    list<Rule*> *corrEndRules = new list<Rule*>;
    corrEndRules->push_front(new FrontShortRule(myShortThreshold));
    corrEndRules->push_back(new BackLongRule(myLongThreshold));
    corrEndRules->push_back(new TwoSideShortRule(myShortThreshold));
    myPatterns.push_back(new FeaturePatternImpl(corrEnd, corrEndRules));

    // Create alcove end pattern
    EnvFeature *alcoveEnd = new EnvFeature("alcove end");
    list<Rule*> *alcEndRules = new list<Rule*>;
    alcEndRules->push_back(new FrontShortRule(myShortThreshold));
    alcEndRules->push_back(new BackLongRule(myLongThreshold));
    alcEndRules->push_back(
        new SideInterRule(myShortThreshold, myLongThreshold));
    alcEndRules->push_back(new SideShortRule(myShortThreshold));
    myPatterns.push_back(new FeaturePatternImpl(alcoveEnd, alcEndRules));

    // Create dual alcove end pattern
    EnvFeature *dualAlcoveEnd = new EnvFeature("dual alcove end");
    list<Rule*> *dAlcEndRules = new list<Rule*>;
    dAlcEndRules->push_back(new FrontShortRule(myShortThreshold));
    dAlcEndRules->push_back(new BackLongRule(myLongThreshold));
    dAlcEndRules->push_back(
        new TwoSideInterRule(myShortThreshold, myLongThreshold));
    myPatterns.push_back(new FeaturePatternImpl(dualAlcoveEnd, dAlcEndRules));

    // Create L pattern
    EnvFeature *L = new EnvFeature("L");
    list<Rule*> *lRules = new list<Rule*>;
    lRules->push_back(new FrontShortRule(myShortThreshold));
    lRules->push_back(new BackLongRule(myLongThreshold));
    lRules->push_back(new SideShortRule(myShortThreshold));
    lRules->push_back(new SideLongRule(myShortThreshold, myLongThreshold));
    myPatterns.push_back(new FeaturePatternImpl(L, lRules));
}

// Initialize the sensor suite
bool Robot::initSensors()
{
    int status = 0;

    // Attach position data to sonar data
    status += posDataRequest(POS_SONAR);

    // Configure sensors for firing
    status += conf_sn(mySonarFireRate, mySonarFiringOrder);

    return (status == 2) ? true : false;
}
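The five-digit case labels in avoid() are easiest to read as atoi() applied to the obstacle string assembled by checkForObstacles(): each of the five forward-facing sonars contributes a '1' when its range falls below the obstacle threshold and a '0' otherwise. The standalone sketch below illustrates the encoding; the readings and threshold are hypothetical, not values from the thesis runs.

    #include <cstdlib>
    #include <iostream>
    #include <string>
    using namespace std;

    int main()
    {
        // Hypothetical ranges for the five forward-facing sonars and a
        // hypothetical obstacle threshold.
        long ranges[5] = {20, 200, 15, 180, 210};
        long minRangeToObs = 30;

        string sensed;
        for(int i = 0; i < 5; ++i)
            sensed += (ranges[i] < minRangeToObs) ? "1" : "0";

        // atoi() turns the pattern into the integer used as a case label.
        cout << sensed << " -> " << atoi(sensed.c_str()) << endl;  // 10100 -> 10100
        return 0;
    }

Note that leading zeros vanish under atoi(), which is why a pattern such as "01100" appears in the switch above as case 1100.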

////////////////////////////////////////////////////////////////////////////
// Name:        Robot2Main.cc
// Description: Main for robot exploration program which identifies
//              large-scale indoor features such as corridors, four way
//              intersections, etc.
// Created:     9/25/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "Robot.h"

//////////////
// Main
//////////////
int main(int /* argc */, char *argv[])
{
    // Create robot
    Robot robby(argv[1]);

#ifdef SELFTEST
    robby.explore();
    robby.report();
#else
    if(robby.init())
    {
        robby.explore();
        robby.report();
        robby.shutdown();
    }
#endif

    return 0;
}
#ifndef RULE_H
#define RULE_H
////////////////////////////////////////////////////////////////////////////
// Name:        Rule.h
// Description: Base class for Rule hierarchy. Rules are used by
//              FeaturePatternImpl's to determine if a feature is
//              matched by a particular set of sonar data.
// Created:     11/25/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////

//////////////////////
// Forward Decls
//////////////////////
template <class T> class vector;

class Rule
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Destructor
    virtual ~Rule();

    // Determine if data matches
    virtual bool match(const vector<long> &theData) const = 0;

//////////////////////////
// Protected Functions
//////////////////////////
protected:

//////////////////////
// Private Functions
//////////////////////
private:
//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        Rule.cc
// Description: Implementation for Rule class. Used to provide non-
//              inline virtual destructor.
// Created:     11/25/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "Rule.h"

//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
Rule::~Rule()
{
}

//////////////////////////
// Protected Functions
//////////////////////////
//////////////////////
// Private Functions
//////////////////////

#ifndef RULEIMPL_H
#define RULEIMPL_H
////////////////////////////////////////////////////////////////////////////
// Name:        RuleImpl.h
// Description: Provides implementations of functions for use by derived
//              classes. RuleImpl is derived from Rule.
// Created:     11/26/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include <vector>   // (header name inferred; lost in the printed listing)
#include "Rule.h"

//////////////////////
// Forward Decls
//////////////////////

//////////////////
// Namespaces
//////////////////
using namespace std;

// Base class
class RuleImpl : public Rule
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Constructor
    RuleImpl();

    // Destructor
    virtual ~RuleImpl();
//////////////////////////
// Protected Functions
//////////////////////////
protected:

    // Get data from SonarData
    long getBack(const vector<long> &theData) const;
    long getBackArc(const vector<long> &theData) const;
    long getBackLeftArc(const vector<long> &theData) const;
    long getBackRightArc(const vector<long> &theData) const;
    long getFront(const vector<long> &theData) const;
    long getFrontArc(const vector<long> &theData) const;
    long getFrontLeftArc(const vector<long> &theData) const;
    long getFrontRightArc(const vector<long> &theData) const;
    long getLeft(const vector<long> &theData) const;
    long getLeftArc(const vector<long> &theData) const;
    long getRight(const vector<long> &theData) const;
    long getRightArc(const vector<long> &theData) const;

//////////////////////
// Private Functions
//////////////////////
private:

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:
};

#endif
////////////////////////////////////////////////////////////////////////////
// Name:        RuleImpl.cc
// Description: Provides implementations of functions for use by derived
//              classes
// Created:     11/26/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "RuleImpl.h"

//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
RuleImpl::RuleImpl()
{
}

RuleImpl::~RuleImpl()
{
}

//////////////////////////
// Protected Functions
//////////////////////////
long RuleImpl::getBack(const vector<long> &theData) const
{
    return theData[8];
}

long RuleImpl::getBackArc(const vector<long> &theData) const
{
    return (theData[7] + theData[8] + theData[9]);
}

long RuleImpl::getBackLeftArc(const vector<long> &theData) const
{
    return (theData[5] + theData[6] + theData[7]);
}

long RuleImpl::getBackRightArc(const vector<long> &theData) const
{
    return (theData[9] + theData[10] + theData[11]);
}

long RuleImpl::getFront(const vector<long> &theData) const
{
    return theData[0];
}

long RuleImpl::getFrontArc(const vector<long> &theData) const
{
    return (theData[15] + theData[0] + theData[1]);
}

long RuleImpl::getFrontLeftArc(const vector<long> &theData) const
{
    return (theData[1] + theData[2] + theData[3]);
}

long RuleImpl::getFrontRightArc(const vector<long> &theData) const
{
    return (theData[13] + theData[14] + theData[15]);
}

long RuleImpl::getLeft(const vector<long> &theData) const
{
    return theData[4];
}

long RuleImpl::getLeftArc(const vector<long> &theData) const
{
    return (theData[3] + theData[4] + theData[5]);
}

long RuleImpl::getRight(const vector<long> &theData) const
{
    return theData[12];
}
long RuleImpl::getRightArc(const vector<long> &theData) const
{
    return (theData[11] + theData[12] + theData[13]);
}

//////////////////////
// Private Functions
//////////////////////
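Because every accessor above indexes the same 16-element ring of range readings (0 straight ahead, 4 on the left, 8 behind, 12 on the right), concrete rules reduce to one or two threshold comparisons, and a pattern's percent match is simply the fraction of its rules a given image satisfies. The sketch below is illustrative only, with simplified stand-ins for the thesis classes and hypothetical thresholds.

    #include <iostream>
    #include <list>
    #include <vector>
    using namespace std;

    // Simplified stand-in for the Rule hierarchy.
    struct Rule {
        virtual ~Rule() {}
        virtual bool match(const vector<long>& pings) const = 0;
    };

    // Hypothetical rule: "long" reading straight ahead.
    struct FrontLong : Rule {
        long threshold;
        FrontLong(long t) : threshold(t) {}
        bool match(const vector<long>& p) const { return p[0] >= threshold; }
    };

    // Hypothetical rule: "short" readings on both sides.
    struct TwoSideShort : Rule {
        long threshold;
        TwoSideShort(long t) : threshold(t) {}
        bool match(const vector<long>& p) const
        { return p[4] <= threshold && p[12] <= threshold; }
    };

    // Percent match = fraction of a pattern's rules satisfied by one image.
    float percentMatch(const list<Rule*>& rules, const vector<long>& pings)
    {
        int hits = 0;
        for(list<Rule*>::const_iterator it = rules.begin();
            it != rules.end(); ++it)
            if((*it)->match(pings)) ++hits;
        return rules.empty() ? 0.0f : (float)hits / rules.size();
    }

    int main()
    {
        long img[16] = {200,180,170,60,40,45,150,190,
                        210,185,155,50,38,44,160,195};
        vector<long> pings(img, img + 16);

        list<Rule*> corridor;
        corridor.push_back(new FrontLong(150));
        corridor.push_back(new TwoSideShort(60));

        cout << "corridor match: " << percentMatch(corridor, pings) << endl; // 1

        for(list<Rule*>::iterator it = corridor.begin();
            it != corridor.end(); ++it)
            delete *it;
        return 0;
    }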

#ifndef SCREENLOG_H
#define SCREENLOG_H
////////////////////////////////////////////////////////////////////////////
// Name:        ScreenLog.h
// Description: Derived log class which writes information to the screen.
// Created:     12/14/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "Log.h"
// (remaining header names inferred; lost in the printed listing)
#include <string>
#include <iostream>

//////////////////////
// Forward Decls
//////////////////////

//////////////////
// Namespaces
//////////////////
using namespace std;

class ScreenLog : public Log
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Constructor
    ScreenLog();

    // Destructor
    ~ScreenLog();

//////////////////////////
// Protected Functions
//////////////////////////
protected:

//////////////////////
// Private Functions
//////////////////////
private:

    // Write information to the screen
    void log(const string &aString) { cout << aString << endl; }

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        ScreenLog.cc
// Description: Log class which writes data to screen.
// Created:     12/14/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "ScreenLog.h"

//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
ScreenLog::ScreenLog()
{
}

ScreenLog::~ScreenLog()
{
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////

#ifndef SIDEINTERRULE_H
#define SIDEINTERRULE_H
////////////////////////////////////////////////////////////////////////////
// Name:        SideInterRule.h
// Description: Rule that indicates there is an "intermediate" distance to
//              at least one side of the robot. Intermediate distance
//              is assumed to indicate an alcove.
// Created:     12/15/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "RuleImpl.h"

//////////////////////
// Forward Decls
//////////////////////

// Base class
class SideInterRule : public RuleImpl
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Constructor
    SideInterRule(long theShortThreshold, long theLongThreshold);

    // Destructor
    ~SideInterRule();

    // Determine if data matches
    virtual bool match(const vector<long> &theData) const;

//////////////////////////
// Protected Functions
//////////////////////////
protected:
//////////////////////
// Private Functions
//////////////////////
private:

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    long itsShortThreshold;
    long itsLongThreshold;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        SideInterRule.cc
// Description: Implementation of rule to determine if there is at least
//              one intermediate distance to the side of the robot.
// Created:     12/15/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "SideInterRule.h"

//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
SideInterRule::SideInterRule(long theShortThreshold,
                             long theLongThreshold)
{
    itsShortThreshold = theShortThreshold;
    itsLongThreshold = theLongThreshold;
}

SideInterRule::~SideInterRule()
{
}

bool SideInterRule::match(const vector<long> &theData) const
{
    bool left = ((getLeft(theData) <= itsLongThreshold) &&
                 (getLeft(theData) >= itsShortThreshold)) ? true : false;

    bool right = ((getRight(theData) <= itsLongThreshold) &&
                  (getRight(theData) >= itsShortThreshold)) ? true : false;

    return ((left == true) || (right == true)) ? true : false;
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////

#ifndef SIDELONGRULE_H
#define SIDELONGRULE_H
////////////////////////////////////////////////////////////////////////////
// Name:        SideLongRule.h
// Description: Rule that indicates there is a "long" distance to
//              at least one side of the robot.
// Created:     12/4/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "RuleImpl.h"

//////////////////////
// Forward Decls
//////////////////////

// Base class
class SideLongRule : public RuleImpl
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Constructor
    SideLongRule(long theShortThreshold, long theLongThreshold);

    // Destructor
    ~SideLongRule();

    // Determine if data matches
    virtual bool match(const vector<long> &theData) const;

//////////////////////////
// Protected Functions
//////////////////////////
protected:
//////////////////////
// Private Functions
//////////////////////
private:

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    long itsShortThreshold;
    long itsLongThreshold;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        SideLongRule.cc
// Description: Implementation of rule to determine if there is at least
//              one long distance to the side of the robot.
// Created:     12/4/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "SideLongRule.h"

//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
SideLongRule::SideLongRule(long theShortThreshold,
                           long theLongThreshold)
{
    itsShortThreshold = theShortThreshold;
    itsLongThreshold = theLongThreshold;
}

SideLongRule::~SideLongRule()
{
}

bool SideLongRule::match(const vector<long> &theData) const
{
    bool left = ((getLeft(theData) > itsLongThreshold) &&
                 (getRight(theData) < itsShortThreshold)) ? true : false;

    bool right = ((getRight(theData) > itsLongThreshold) &&
                  (getLeft(theData) < itsShortThreshold)) ? true : false;

    return ((left == true) || (right == true)) ? true : false;
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////

#ifndef SIDESHORTRULE_H
#define SIDESHORTRULE_H
////////////////////////////////////////////////////////////////////////////
// Name:        SideShortRule.h
// Description: Rule that indicates there is a "short" distance to
//              at least one side of the robot.
// Created:     12/1/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "RuleImpl.h"

//////////////////////
// Forward Decls
//////////////////////

// Base class
class SideShortRule : public RuleImpl
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Constructor
    SideShortRule(long theThreshold);

    // Destructor
    ~SideShortRule();

    // Determine if data matches
    virtual bool match(const vector<long> &theData) const;

//////////////////////////
// Protected Functions
//////////////////////////
protected:
//////////////////////
// Private Functions
//////////////////////
private:

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    long itsThreshold;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        SideShortRule.cc
// Description: Implementation of rule to determine if there is at least
//              one short distance to the side of the robot.
// Created:     12/4/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "SideShortRule.h"

//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
SideShortRule::SideShortRule(long theThreshold)
{
    itsThreshold = theThreshold;
}

SideShortRule::~SideShortRule()
{
}

bool SideShortRule::match(const vector<long> &theData) const
{
    return ((getLeft(theData) < itsThreshold) ||
            (getRight(theData) < itsThreshold)) ? true : false;
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////

#ifndef SONARDATA_H
#define SONARDATA_H
////////////////////////////////////////////////////////////////////////////
// Name:        SonarData.h
// Description: Class which represents the set of sonar data taken at
//              a particular position.
// Created:     10/20/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include <vector>   // (header name inferred; lost in the printed listing)

//////////////////////
// Forward Decls
//////////////////////
class ostream;

//////////////////
// Namespaces
//////////////////
using namespace std;

class SonarData
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Default ctor
    SonarData();

    // ctor which takes x posit, y posit, and vector of sonar data
    SonarData(long xData, long yData,
              const vector<long> &sonarPings);

    // ctor which takes x posit, y posit, and vector of sonar data
    SonarData(float xData, float yData,
              const vector<long> &sonarPings);

    // dtor
    ~SonarData();

    // copy ctor
    SonarData(const SonarData &data);

    // op=
    SonarData& operator=(const SonarData &data);

    // Add a ping to itsSonarPings
    void add(long theSonarPing) { mySonarPings.push_back(theSonarPing); }

    const vector<long> &getSonarPings() const { return mySonarPings; }

    float getXPosit() const { return myXPosit; }
    float getYPosit() const { return myYPosit; }

    // Set timestamp of SonarData
    void setTime();

    // Set X position
    void setXPosit(long theXPosit);
    void setXPosit(float theXPosit);

    // Set Y position
    void setYPosit(long theYPosit);
    void setYPosit(float theYPosit);

    char* toString() const;

    friend ostream & operator<<(ostream & os, const SonarData & data);

//////////////////////////
// Protected Functions
//////////////////////////
protected:

//////////////////////
// Private Functions
//////////////////////
private:
//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    // timestamp for sonar readings
    long mySeconds;
    long myMilliseconds;

    // X position of robot when sonar readings were taken
    float myXPosit;

    // Y position of robot when sonar readings were taken
    float myYPosit;

    vector<long> mySonarPings;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        SonarData.cc
// Description: Implementation for a class which holds sonar data obtained
//              from the Scout robot.
// Created:     10/20/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "SonarData.h"
// (remaining header names inferred; lost in the printed listing)
#include <strstream>
#include <iostream>
#include <ctime>
#include <sys/time.h>   // For gettimeofday
//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
SonarData::SonarData()
{
    // Set timestamp
    setTime();

    myXPosit = 0.0;
    myYPosit = 0.0;
}

SonarData::SonarData(long xData, long yData,
                     const vector<long> & sonarPings):
    mySonarPings(sonarPings)
{
    // Set timestamp
    setTime();

    // Convert data from long to float
    setXPosit(xData);
    setYPosit(yData);
}

SonarData::SonarData(float xData, float yData,
                     const vector<long> & sonarPings):
    mySonarPings(sonarPings)
{
    // Set timestamp
    setTime();

    myXPosit = xData;
    myYPosit = yData;
}

// copy ctor
SonarData::SonarData(const SonarData &data) :
    mySonarPings(data.mySonarPings)
{
    mySeconds = data.mySeconds;
    myMilliseconds = data.myMilliseconds;
    myXPosit = data.myXPosit;
    myYPosit = data.myYPosit;
}

SonarData::~SonarData()
{
}

// op=
SonarData & SonarData::operator=(const SonarData &data)
{
    if(&data != this)
    {
        mySeconds = data.mySeconds;
        myMilliseconds = data.myMilliseconds;
        myXPosit = data.myXPosit;
        myYPosit = data.myYPosit;
        mySonarPings = data.mySonarPings;
    }

    return *this;
}

void SonarData::setTime()
{
    // Get timestamp
    struct timeval tempTime;
    struct timezone tempZone;
    gettimeofday(&tempTime, &tempZone);
    mySeconds = tempTime.tv_sec;
    myMilliseconds = tempTime.tv_usec;
}

void SonarData::setXPosit(long theXPosit)
{
    myXPosit = theXPosit/10.0;
}

void SonarData::setXPosit(float theXPosit)
{
    myXPosit = theXPosit;
}
void SonarData::setYPosit(long theYPosit)
{
    myYPosit = theYPosit/10.0;
}

void SonarData::setYPosit(float theYPosit)
{
    myYPosit = theYPosit;
}

char* SonarData::toString() const
{
    ostrstream oss;
    oss << "Time: " << mySeconds << "." << myMilliseconds << endl;
    oss << "X: " << myXPosit << " Y: " << myYPosit << endl;
    for(size_t i = 0; i < mySonarPings.size(); ++i)
    {
        oss << mySonarPings[i] << " ";
    }
    oss << '\0';
    return oss.str();
}
// (The remainder of this listing, including the friend operator<<, was not
// legible in the source document; the loop body above is reconstructed.)

#ifndef SONARIMAGE_H
#define SONARIMAGE_H
////////////////////////////////////////////////////////////////////////////
// Name:        SonarImage.h
// Description: A SonarImage is a filtered SonarData created from N
//              SonarDatas. Can be compared to FeaturePatterns to
//              determine what feature the image represents.
// Created:     10/31/98
// Last Mod.:   12/22/98
//              Added op<<
//              1/9/99
//              Added matchPair vector and operations
//              modified op<< & toString
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
// (header names inferred from the trailing comments and from usage;
// lost in the printed listing)
#include <string>        // For Feature type
#include <vector>        // Container for raw sonar data
#include <utility>       // For Feature type/percent match pairs
#include "SonarData.h"   // Holds filtered sonar data

//////////////////////
// Forward Decls
//////////////////////
class DataFilter;   // Filter for raw data
class ostream;      // For op<<

//////////////////
// Namespaces
//////////////////
using namespace std;

class SonarImage
{
//////////////////////
// Public Functions
//////////////////////
public:

    // typedef for matchPair
    // (template arguments inferred from usage in Robot::explore)
    typedef pair<float, string> matchPair;

    // Constructors
    SonarImage(DataFilter &theFilter, vector<SonarData> &theSonarData);

    // Destructor
    ~SonarImage();

    // Provide filtered data to a client
    const SonarData& getImage() const { return itsFilteredData; }

    // Provide best matches to a client
    const vector<matchPair>& getMatches() const { return itsMatches; }

    // Add matchPairs to SonarImage
    void setMatches(const vector<matchPair>& theMatches);

    // Convert SonarImage (filtered data) to a string
    char* toString() const;

    friend ostream& operator<<(ostream &os, const SonarImage &image);

//////////////////////////
// Protected Functions
//////////////////////////
protected:

//////////////////////
// Private Functions
//////////////////////
private:

    SonarImage(const SonarImage &orig);
    SonarImage& operator=(const SonarImage &orig);

    void filter();

    void setXPosit();
    void setYPosit();

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    mutable SonarData    itsFilteredData;
    DataFilter&          itsDataFilter;
    vector<SonarData>&   itsRawData;
    vector<matchPair>    itsMatches;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        SonarImage.cc
// Description: Implementation for SonarImage class.
// Created:     10/31/98
// Last Mod.:   12/22/98
//              Added op<< for displaying SonarImage
//              1/9/99
//              Added matchPair operations, modified op<< and toString
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
// (header names inferred from usage; lost in the printed listing)
#include <strstream>
#include <iostream>
#include <vector>
#include <utility>
#include <numeric>      // For accumulate
#include "Sonarlmage.h" #include II For filtering of raw data 11111111111111 II Statics 11111111111111 I I I II 11111111111111111111 Ill I I Public Functions I 11111111111111 IIIII /1111111 Sonarhnage: : Sonarhnage(DataFilter &theFilter, vector<;:SonarData> &theSonarData): itsDataFilter( theF ilter ), itsRawData( theSonarData) { filter(); setXPosit(); setYPosit(); } Sonarhnage: :-Sonarhnage() { } void Sonarlmage::setMatches(const vector& theMatches) { itsMatches = theMatches; } char* Sonar Image: :toString() const { ostrstream oss; oss << itsFilteredData; if(!itsMatches.empty()) { oss <<"Best Feature Matches:"<< endl; vector: :const_iterator iter; for(iter = itsMatches.begin(); iter != itsMatches.end(); ++iter) { 203

PAGE 217

            oss << (*iter).first << " " << (*iter).second << endl;
        }
    }

    oss << '\0';
    return oss.str();
}

ostream& operator<<(ostream &os, const SonarImage &image)
{
    os << image.getImage();

    if(!image.getMatches().empty())
    {
        os << "Best Feature Matches:" << endl;
        vector<SonarImage::matchPair>::const_iterator iter;
        for(iter = image.getMatches().begin();
            iter != image.getMatches().end(); ++iter)
        {
            os << (*iter).first << " " << (*iter).second << endl;
        }
    }

    return os;
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////
void SonarImage::filter()
{
    // Steps that SonarImage must perform before giving raw data to filter
    // Need to decide whether "bad" data points will be thrown out before
    // or after filtering

    vector<long> rawData(itsRawData.size());

    // Get vectors of data to give to filter and filter data
    for(int i = 0; i < 16; ++i)
    {
        // (loop body reconstructed; the printed listing was illegible here)
        for(size_t j = 0; j < itsRawData.size(); ++j)
        {
            rawData[j] = itsRawData[j].getSonarPings()[i];
        }
        itsFilteredData.add(itsDataFilter.filter(rawData));
    }
}

void SonarImage::setXPosit()
{
    vector<float> xPosits(itsRawData.size());
    for(size_t i = 0; i < itsRawData.size(); ++i)
    {
        xPosits[i] = itsRawData[i].getXPosit();
    }

    itsFilteredData.setXPosit(
        static_cast<float>(accumulate(xPosits.begin(), xPosits.end(), 0.0) /
                           xPosits.size()));
}

void SonarImage::setYPosit()
{
    vector<float> yPosits(itsRawData.size());
    for(size_t i = 0; i < itsRawData.size(); ++i)
    {
        yPosits[i] = itsRawData[i].getYPosit();
    }

    itsFilteredData.setYPosit(
        static_cast<float>(accumulate(yPosits.begin(), yPosits.end(), 0.0) /
                           yPosits.size()));
}
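filter() hands the DataFilter one sensor at a time: the i-th ping from each of the N raw SonarDatas forms a column, and the filter reduces that column to the single value stored in the filtered image. The minimal standalone sketch below shows the same idea, using a median purely as a stand-in for the pluggable DataFilter; the names and data are illustrative.

    #include <algorithm>
    #include <iostream>
    #include <vector>
    using namespace std;

    // Reduce one column of readings to a single value (stand-in filter).
    long medianOf(vector<long> v)
    {
        sort(v.begin(), v.end());
        return v[v.size() / 2];
    }

    // readings is N x 16: N raw snapshots of the 16-sonar ring.
    vector<long> filterImage(const vector< vector<long> >& readings)
    {
        vector<long> image(16);
        for(int i = 0; i < 16; ++i)
        {
            vector<long> column(readings.size());
            for(size_t j = 0; j < readings.size(); ++j)
                column[j] = readings[j][i];   // sensor i across all N readings
            image[i] = medianOf(column);
        }
        return image;
    }

    int main()
    {
        // Three hypothetical 16-ping readings; the middle value survives.
        vector< vector<long> > readings(3, vector<long>(16, 100));
        readings[0][0] = 90; readings[2][0] = 255;   // outlier on sensor 0
        cout << filterImage(readings)[0] << endl;    // prints 100
        return 0;
    }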

#ifndef SUBJECT_H
#define SUBJECT_H
////////////////////////////////////////////////////////////////////////////
// Name:        Subject.h
// Description: Class which is monitored by other classes and provides
//              updates to these Observers. Based on the Observer pattern
//              in Design Patterns.
// Created:     10/12/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
// (header names inferred from usage; lost in the printed listing)
#include <list>
#include <memory>
#include <string>
#include "Observer.h"

//////////////////////
// Forward Decls
//////////////////////

//////////////////
// Namespaces
//////////////////
using namespace std;

class Subject
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Destructor
    virtual ~Subject() = 0;

    // Add an Observer
    virtual void attach(auto_ptr<Observer> anObserver);

    // Send notification to Observers
    virtual void notify(const string &aString);

//////////////////////////
// Protected Functions
//////////////////////////
protected:

    // Get Observers
    const list<auto_ptr<Observer> >& getObs() const { return observers; }

//////////////////////
// Private Functions
//////////////////////
private:

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    list<auto_ptr<Observer> > observers;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        Subject.cc
// Description: Implementation for Subject class.
// Created:     10/15/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include <string>   // (header name inferred; lost in the printed listing)
#include "Subject.h"
//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
Subject::~Subject()
{
}

void Subject::attach(auto_ptr<Observer> anObserver)
{
    observers.push_back(anObserver);
}

void Subject::notify(const string &aString)
{
    list<auto_ptr<Observer> >::iterator iter;
    for(iter = observers.begin(); iter != observers.end(); ++iter)
    {
        (*iter)->update(aString);
    }
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////
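One caution about the listing above: keeping auto_ptr objects inside a standard container is unsafe, since copying an auto_ptr transfers ownership (this is why later C++ standards deprecated and then removed it). The sketch below shows the same attach/notify flow using plain owning pointers instead; the class and names here are illustrative, not the thesis code.

    #include <iostream>
    #include <list>
    #include <string>
    using namespace std;

    struct Observer {
        virtual ~Observer() {}
        virtual void update(const string&) = 0;
    };

    struct CoutObserver : Observer {
        void update(const string& s) { cout << s << endl; }
    };

    class Subject {
    public:
        ~Subject() {
            // Subject owns its observers and deletes them on destruction.
            for(list<Observer*>::iterator i = observers.begin();
                i != observers.end(); ++i)
                delete *i;
        }
        void attach(Observer* o) { observers.push_back(o); }
        void notify(const string& s) {
            for(list<Observer*>::iterator i = observers.begin();
                i != observers.end(); ++i)
                (*i)->update(s);
        }
    private:
        list<Observer*> observers;
    };

    int main() {
        Subject subj;
        subj.attach(new CoutObserver);
        subj.notify("Robot created");   // printed via the attached observer
        return 0;
    }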

#ifndef TIMESTAMP_H
#define TIMESTAMP_H
////////////////////////////////////////////////////////////////////////////
// Name:        Timestamp.h
// Description: Representation of a timestamp used in logging information
//              for the robot
// Created:     12/31/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include <sys/time.h>   // For access to time functions
                        // (header name inferred; lost in the printed listing)

//////////////////////
// Forward Decls
//////////////////////

//////////////////
// Namespaces
//////////////////

class Timestamp
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Get time of form Month Day HH:MM:SS Year
    static const char* timestamp();

    // Get milliseconds
    static long getUsec();

    // Get seconds from epoch
    static long getSec();

//////////////////////////
// Protected Functions
//////////////////////////
protected:

//////////////////////
// Private Functions
//////////////////////
private:

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    static struct timeval aTime;
    static struct timezone aZone;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        Timestamp.cc
// Description: Implementation to provide timestamps for log information
// Created:     12/31/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "Timestamp.h"
#include <string>   // (header name inferred; lost in the printed listing)

//////////////
// Statics
//////////////
struct timeval Timestamp::aTime;
struct timezone Timestamp::aZone;
//////////////////////
// Public Functions
//////////////////////
const char* Timestamp::timestamp()
{
    gettimeofday(&aTime, &aZone);
    time_t theSeconds = aTime.tv_sec;
    string strtime(ctime(&theSeconds));
    strtime.replace(strtime.find("\n", 0), 2, "");
    return strtime.c_str();
}

long Timestamp::getUsec()
{
    gettimeofday(&aTime, &aZone);
    return aTime.tv_usec;
}

long Timestamp::getSec()
{
    gettimeofday(&aTime, &aZone);
    return aTime.tv_sec;
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////
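Note that timestamp() above returns the c_str() of a function-local string, a pointer that dangles as soon as the function returns. The sketch below shows one era-appropriate alternative that holds the text in a static string; the function name is illustrative, not part of the thesis code.

    #include <ctime>
    #include <iostream>
    #include <string>
    #include <sys/time.h>
    using namespace std;

    const char* safeTimestamp()
    {
        static string held;              // outlives the call
        struct timeval tv;
        struct timezone tz;
        gettimeofday(&tv, &tz);
        time_t secs = tv.tv_sec;
        held = ctime(&secs);             // "Day Mon DD HH:MM:SS YYYY\n"
        string::size_type nl = held.find('\n');
        if(nl != string::npos)
            held.erase(nl);              // drop the trailing newline
        return held.c_str();
    }

    int main()
    {
        cout << safeTimestamp() << endl;
        return 0;
    }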

#ifndef TWOSIDEINTERRULE_H
#define TWOSIDEINTERRULE_H
////////////////////////////////////////////////////////////////////////////
// Name:        TwoSideInterRule.h
// Description: Rule that indicates there is an "intermediate" distance
//              on two sides of the robot. Intermediate distance
//              is assumed to indicate an alcove.
// Created:     12/15/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "RuleImpl.h"

//////////////////////
// Forward Decls
//////////////////////

// Base class
class TwoSideInterRule : public RuleImpl
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Constructor
    TwoSideInterRule(long theShortThreshold, long theLongThreshold);

    // Destructor
    ~TwoSideInterRule();

    // Determine if data matches
    virtual bool match(const vector<long> &theData) const;

//////////////////////////
// Protected Functions
//////////////////////////
protected:
//////////////////////
// Private Functions
//////////////////////
private:

//////////////////////////
// Protected Members
//////////////////////////
protected:

//////////////////////
// Private Members
//////////////////////
private:

    long itsShortThreshold;
    long itsLongThreshold;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name:        TwoSideInterRule.cc
// Description: Implementation of rule to determine if there are two
//              intermediate distances to the side of the robot.
// Created:     12/15/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "TwoSideInterRule.h"

//////////////
// Statics
//////////////

//////////////////////
// Public Functions
//////////////////////
TwoSideInterRule::TwoSideInterRule(long theShortThreshold,
                                   long theLongThreshold)
{
    itsShortThreshold = theShortThreshold;
    itsLongThreshold = theLongThreshold;
}

TwoSideInterRule::~TwoSideInterRule()
{
}

bool TwoSideInterRule::match(const vector<long> &theData) const
{
    bool left = ((getLeft(theData) <= itsLongThreshold) &&
                 (getLeft(theData) >= itsShortThreshold)) ? true : false;

    bool right = ((getRight(theData) <= itsLongThreshold) &&
                  (getRight(theData) >= itsShortThreshold)) ? true : false;

    return ((left == true) && (right == true)) ? true : false;
}

//////////////////////////
// Protected Functions
//////////////////////////

//////////////////////
// Private Functions
//////////////////////

#ifndef TWOSIDELONGRULE_H
#define TWOSIDELONGRULE_H
////////////////////////////////////////////////////////////////////////////
// Name:        TwoSideLongRule.h
// Description: Rule that indicates there is a "long" distance on
//              both sides of the robot.
// Created:     12/1/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

//////////////////
// Includes
//////////////////
#include "RuleImpl.h"

//////////////////////
// Forward Decls
//////////////////////

// Base class
class TwoSideLongRule : public RuleImpl
{
//////////////////////
// Public Functions
//////////////////////
public:

    // Constructor
    TwoSideLongRule(long theThreshold);

    // Destructor
    ~TwoSideLongRule();

    // Determine if data matches
    virtual bool match(const vector<long> &theData) const;

//////////////////////////
// Protected Functions
//////////////////////////
protected:
111/1 II I ll////l////////ll/// I I Private Functions II II l!/l////l///ll/// IIIII// private: II II I I I I I I I I lll/l///l//l/l//// I I Protected Members I II I I Ill Ill/ Ill I II I Ill //Ill// I protected: II I ll//l//////ll/ll/l////// I I Private Members I I II I II I I ll/l///l//l/!/l/l/ private: long itsThreshold; }; #end if //l!!l////ll/ll/!/l//ll///l/////ll//l////l/ll/l//l///l///l/l//l//!/ll/l/l///l////l/lll/ll/ll/l//l/lll///l/l/ //Name: TwoSideLongRule cc II Description: II Implementation of rule to determine if there are two long distances to the side of the robot. II Created: 12/4/98 II Last Mod.: I I I I I I I I I I II I Ill/ I I I I IIIII Ill/ I I II I I II I I I I I I I I I I I I I I I I II I I/////// I I II I II I I I I I II I I II II I I II II I I I II I I II Ill/ /Ill lllll///ll//l/l/ II Includes //l/1/ll////l//l #include "TwoSideLongRule.h" //l/1/l/////ll II Statics III/II/III/!/! I I II II II I I I /lll///!/l/lll/11 II Public Functions I II I I II Ill II/II/III/II/I/ Ill TwoSideLongRule::TwoSideLongRule(long theThreshold) 216


{
    itsThreshold = theThreshold;
}

TwoSideLongRule::~TwoSideLongRule()
{
}

bool TwoSideLongRule::match(const vector<long> &theData) const
{
    return ((getLeft(theData)  >= itsThreshold) &&
            (getRight(theData) >= itsThreshold)) ? true : false;
}

//////////////////////////////
// Protected Functions
//////////////////////////////

//////////////////////////////
// Private Functions
//////////////////////////////


#ifndef TWOSIDESHORTRULE_H
#define TWOSIDESHORTRULE_H

////////////////////////////////////////////////////////////////////////////
// Name: TwoSideShortRule.h
// Description: Rule that indicates there is a "short" distance on both
//   sides of the robot.
// Created: 12/1/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "RuleImpl.h"

////////////////////////
// Forward Decls
////////////////////////

// Base class
class TwoSideShortRule : public RuleImpl
{
//////////////////////////////
// Public Functions
//////////////////////////////
public:

    // Constructor
    TwoSideShortRule(long theThreshold);

    // Destructor
    ~TwoSideShortRule();

    // Determine if data matches
    virtual bool match(const vector<long> &theData) const;

//////////////////////////////
// Protected Functions
//////////////////////////////
protected:


//////////////////////////////
// Private Functions
//////////////////////////////
private:

//////////////////////////////
// Protected Members
//////////////////////////////
protected:

//////////////////////////////
// Private Members
//////////////////////////////
private:

    long itsThreshold;
};

#endif

////////////////////////////////////////////////////////////////////////////
// Name: TwoSideShortRule.cc
// Description: Implementation of rule to determine if there are two
//   short distances to the side of the robot.
// Created: 12/4/98
// Last Mod.:
////////////////////////////////////////////////////////////////////////////

////////////////
// Includes
////////////////
#include "TwoSideShortRule.h"

////////////////
// Statics
////////////////

//////////////////////////////
// Public Functions
//////////////////////////////

TwoSideShortRule::TwoSideShortRule(long theThreshold)


{
    itsThreshold = theThreshold;
}

TwoSideShortRule::~TwoSideShortRule()
{
}

bool TwoSideShortRule::match(const vector<long> &theData) const
{
    return ((getLeft(theData)  <= itsThreshold) &&
            (getRight(theData) <= itsThreshold)) ? true : false;
}

//////////////////////////////
// Protected Functions
//////////////////////////////

//////////////////////////////
// Private Functions
//////////////////////////////
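The three TwoSide rules above differ only in how the left and right ranges of a sonar image are compared against the short and long thresholds. The following stand-alone sketch, which is not part of the thesis listings, exercises that logic against the first SonarImage of the Appendix B corridor run; the element type vector<long>, the helper functions, and the choice of sonars 4 and 12 as the right- and left-facing readings are assumptions made for illustration only.

// Stand-alone sketch, not thesis code: evaluates the two-sided threshold
// rules against one sonar image from the Appendix B corridor run.
#include <iostream>
#include <vector>
using namespace std;

// Assumed stand-ins for the RuleImpl helpers; the side-facing sonar
// indices (4 = right, 12 = left) are guesses for illustration only.
static long getRight(const vector<long> &d) { return d[4]; }
static long getLeft(const vector<long> &d)  { return d[12]; }

int main()
{
    const long shortThresh = 54;   // robot.cfg short threshold (inches)
    const long longThresh  = 78;   // robot.cfg long threshold (inches)

    // First SonarImage from the feature identification log (X: 4.9).
    vector<long> image = {224, 122, 86, 51, 49, 74, 146, 145,
                          165, 97, 94, 50, 49, 50, 204, 169};

    bool twoSideShort = getLeft(image) <= shortThresh &&
                        getRight(image) <= shortThresh;
    bool twoSideInter = getLeft(image) >= shortThresh &&
                        getLeft(image) <= longThresh &&
                        getRight(image) >= shortThresh &&
                        getRight(image) <= longThresh;
    bool twoSideLong  = getLeft(image) >= longThresh &&
                        getRight(image) >= longThresh;

    cout << boolalpha
         << "TwoSideShortRule: " << twoSideShort << "\n"
         << "TwoSideInterRule: " << twoSideInter << "\n"
         << "TwoSideLongRule:  " << twoSideLong  << "\n";
}

With side readings of 49 inches against the robot.cfg thresholds of 54 and 78, only the short rule fires, which is consistent with the corridor classification in the logs.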


Appendix B: Program Output and Configuration File

The output from a typical execution of the computer program and the configuration file used by the program are presented in this appendix. The output is from a short traversal of a corridor. Logs containing information related to feature identification, sonar data collection, and debugging are presented in that order. The configuration file, robot.cfg, is presented last.


Feature Identification Log

Start log: Sun Feb 7 12:30:37 1999

Sun Feb 7 12:30:42 1999: Feature Type: corridor Start: 4.9 End: 4.9
Sun Feb 7 12:30:45 1999: Feature Type: corridor Start: 4.9 End: 12.4667
Sun Feb 7 12:30:48 1999: Feature Type: corridor Start: 4.9 End: 20.2167
Sun Feb 7 12:30:51 1999: Feature Type: corridor Start: 4.9 End: 27.55
Sun Feb 7 12:30:55 1999: Feature Type: corridor Start: 4.9 End: 38.45
Sun Feb 7 12:30:58 1999: Feature Type: corridor Start: 4.9 End: 46.0333
Sun Feb 7 12:31:01 1999: Feature Type: corridor Start: 4.9 End: 53.15
Sun Feb 7 12:31:03 1999: Feature Type: corridor Start: 4.9 End: 53.15

Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way


Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415848.122776 X: 20.2167 Y: 0
0: 223 1: 110 2: 74 3: 51 4: 50 5: 51 6: 165 7: 156 8: 146 9: 105 10: 142 11: 51 12: 49 13: 50 14: 215 15: 178
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415851.172788 X: 27.55 Y: 0
0: 223 1: 118 2: 69 3: 51 4: 50 5: 54 6: 197 7: 160 8: 148 9: 111 10: 143 11: 50 12: 48 13: 49 14: 207 15: 188
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415855.522787 X: 38.45 Y: 0
0: 221 1: 132 2: 69 3: 51 4: 50 5: 51 6: 191 7: 159 8: 158 9: 121 10: 150 11: 61 12: 48 13: 49 14: 198 15: 171
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415858.532783 X: 46.0333 Y: 0
0: 234 1: 125 2: 73 3: 51 4: 50 5: 57 6: 157 7: 162 8: 164 9: 126 10: 140 11: 49 12: 48 13: 49 14: 214 15: 164
Best Feature Matches:
100 corridor
75 alcove


66.6667 four way

Time: 918415861.392786 X: 53.15 Y: 0
0: 227 1: 118 2: 77 3: 51 4: 50 5: 52 6: 183 7: 152 8: 169 9: 127 10: 160 11: 49 12: 48 13: 49 14: 184 15: 158
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

End log: Sun Feb 7 12:31:03 1999
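The Best Feature Matches entries above reflect the selection rule described in the thesis: each feature pattern is scored by the percentage of its rules satisfied by the current sonar image, and the highest-scoring pattern (here, corridor at 100) is reported as the current feature. The sketch below shows one way such scoring could be organized; the Pattern structure, the toy rule sets, and the sonar indices are illustrative assumptions, not the thesis classes, although the three-rule four way toy does happen to reproduce the 66.6667 figure seen in the log.

// Hypothetical sketch of the pattern-scoring step: score each pattern by
// the percentage of its rules matched, then report them in ranked form.
#include <functional>
#include <iostream>
#include <string>
#include <vector>
using namespace std;

struct Pattern {
    string name;
    vector<function<bool(const vector<long>&)>> rules;

    // Percentage of this pattern's rules satisfied by the sonar image.
    double score(const vector<long> &data) const {
        int hits = 0;
        for (const auto &rule : rules)
            if (rule(data)) ++hits;
        return rules.empty() ? 0.0 : 100.0 * hits / rules.size();
    }
};

int main()
{
    // Toy rules over assumed sonar indices (0 front, 8 rear, 4/12 sides).
    auto shortSides = [](const vector<long> &d) { return d[4] <= 54 && d[12] <= 54; };
    auto longSides  = [](const vector<long> &d) { return d[4] >= 78 && d[12] >= 78; };
    auto longFront  = [](const vector<long> &d) { return d[0] >= 78; };
    auto longRear   = [](const vector<long> &d) { return d[8] >= 78; };

    vector<Pattern> patterns = {
        {"corridor", {shortSides, longFront, longRear}},
        {"four way", {longSides, longFront, longRear}},
    };

    // First SonarImage from the log above (X: 4.9).
    vector<long> image = {224, 122, 86, 51, 49, 74, 146, 145,
                          165, 97, 94, 50, 49, 50, 204, 169};

    for (const auto &p : patterns)
        cout << p.score(image) << " " << p.name << "\n";  // 100 corridor, 66.6667 four way
}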


Sonar Data Collection Log

Start log: Sun Feb 7 12:30:37 1999

Sun Feb 7 12:30:39 1999: SonarData collected
Time: 918415839.381796 X: 1.7 Y: 0
0: 238 1: 122 2: 89 3: 51 4: 49 5: 54 6: 93 7: 156 8: 126 9: 122 10: 90 11: 51 12: 49 13: 50 14: 202 15: 154

Sun Feb 7 12:30:39 1999: SonarData collected
Time: 918415839.791776 X: 2.7 Y: 0
0: 238 1: 122 2: 89 3: 51 4: 49 5: 54 6: 93 7: 174 8: 144 9: 89 10: 91 11: 50 12: 49 13: 50 14: 191 15: 153

Sun Feb 7 12:30:40 1999: SonarData collected
Time: 918415840.451777 X: 4.4 Y: 0
0: 210 1: 124 2: 87 3: 51 4: 49 5: 53 6: 184 7: 147 8: 168 9: 93 10: 93 11: 50 12: 49 13: 50 14: 191 15: 153

Sun Feb 7 12:30:40 1999: SonarData collected
Time: 918415840.931784 X: 5.5 Y: 0
0: 222 1: 123 2: 86 3: 51 4: 49 5: 114 6: 159 7: 132 8: 168 9: 93 10: 93 11: 50 12: 49 13: 50 14: 228 15: 150

Sun Feb 7 12:30:41 1999: SonarData collected
Time: 918415841.401790 X: 6.7 Y: 0
0: 219 1: 121 2: 86 3: 51 4: 49 5: 114 6: 159 7: 132 8: 191 9: 95 10: 128 11: 50 12: 49 13: 50 14: 217 15: 203

Sun Feb 7 12:30:42 1999: SonarData collected
Time: 918415842.61782 X: 8.4 Y: 0
0: 219 1: 121 2: 84 3: 51 4: 49 5: 56 6: 193 7: 131 8: 196 9: 95 10: 73 11: 50 12: 49 13: 50 14: 198 15: 206

Sun Feb 7 12:30:42 1999: SonarImage created
Time: 918415842.62862 X: 4.9 Y: 0


0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169

Sun Feb 7 12:30:42 1999: Feature Type: corridor Start: 4.9 End: 4.9
Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Sun Feb 7 12:30:42 1999: SonarData collected
Time: 918415842.501787 X: 9.4 Y: 0
0: 255 1: 85 2: 83 3: 51 4: 50 5: 54 6: 203 7: 140 8: 135 9: 97 10: 73 11: 50 12: 49 13: 50 14: 198 15: 206

Sun Feb 7 12:30:42 1999: SonarData collected
Time: 918415842.921776 X: 10.5 Y: 0
0: 216 1: 118 2: 81 3: 51 4: 50 5: 54 6: 203 7: 140 8: 135 9: 97 10: 133 11: 50 12: 49 13: 50 14: 221 15: 176

Sun Feb 7 12:30:43 1999: SonarData collected
Time: 918415843.541888 X: 12 Y: 0
0: 216 1: 118 2: 81 3: 51 4: 49 5: 54 6: 95 7: 125 8: 134 9: 98 10: 134 11: 50 12: 49 13: 50 14: 220 15: 165

Sun Feb 7 12:30:44 1999: SonarData collected
Time: 918415844.1762 X: 13 Y: 0
0: 255 1: 116 2: 80 3: 51 4: 49 5: 51 6: 95 7: 139 8: 138 9: 100 10: 134 11: 50 12: 49 13: 50 14: 220 15: 165

Sun Feb 7 12:30:44 1999: SonarData collected
Time: 918415844.671785 X: 14.1 Y: 0


0: 223 1: 115 2: 79 3: 51 4: 49 5: 51 6: 95 7: 139 8: 138 9: 100 10: 136 11: 50 12: 49 13: 50 14: 209 15: 207

Sun Feb 7 12:30:45 1999: SonarData collected
Time: 918415845.91783 X: 15.8 Y: 0
0: 224 1: 113 2: 79 3: 51 4: 50 5: 51 6: 101 7: 143 8: 137 9: 102 10: 137 11: 55 12: 49 13: 50 14: 220 15: 147

Sun Feb 7 12:30:45 1999: SonarImage created
Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177

Sun Feb 7 12:30:45 1999: Feature Type: corridor Start: 4.9 End: 12.4667
Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Sun Feb 7 12:30:45 1999: SonarData collected
Time: 918415845.54179 X: 16.9 Y: 0
0: 224 1: 113 2: 77 3: 51 4: 50 5: 51 6: 116 7: 187 8: 159 9: 103 10: 140 11: 53 12: 49 13: 50 14: 220 15: 147


Sun Feb 7 12:30:46 1999: SonarData collected
Time: 918415846.191783 X: 18.5 Y: 0
0: 225 1: 112 2: 76 3: 51 4: 50 5: 51 6: 202 7: 156 8: 142 9: 105 10: 140 11: 53 12: 49 13: 50 14: 217 15: 185

Sun Feb 7 12:30:46 1999: SonarData collected
Time: 918415846.641771 X: 19.5 Y: 0
0: 227 1: 110 2: 74 3: 51 4: 50 5: 51 6: 202 7: 156 8: 142 9: 105 10: 141 11: 52 12: 49 13: 50 14: 218 15: 186

Sun Feb 7 12:30:47 1999: SonarData collected
Time: 918415847.101799 X: 20.8 Y: 0
0: 227 1: 110 2: 74 3: 51 4: 50 5: 51 6: 188 7: 142 8: 144 9: 106 10: 142 11: 50 12: 49 13: 50 14: 218 15: 186

Sun Feb 7 12:30:47 1999: SonarData collected
Time: 918415847.721782 X: 22.3 Y: 0
0: 214 1: 109 2: 73 3: 51 4: 50 5: 51 6: 142 7: 148 8: 145 9: 108 10: 145 11: 50 12: 49 13: 50 14: 216 15: 178

Sun Feb 7 12:30:48 1999: SonarData collected
Time: 918415848.121780 X: 23.3 Y: 0
0: 223 1: 108 2: 72 3: 51 4: 50 5: 51 6: 142 7: 148 8: 145 9: 108 10: 145 11: 50 12: 49 13: 50 14: 201 15: 189

Sun Feb 7 12:30:48 1999: SonarImage created
Time: 918415848.122776 X: 20.2167 Y: 0
0: 223 1: 110 2: 74 3: 51 4: 50 5: 51 6: 165 7: 156 8: 146 9: 105 10: 142 11: 51 12: 49 13: 50 14: 215 15: 178

Sun Feb 7 12:30:48 1999: Feature Type: corridor Start: 4.9 End: 20.2167
Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169


Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415848.122776 X: 20.2167 Y: 0
0: 223 1: 110 2: 74 3: 51 4: 50 5: 51 6: 165 7: 156 8: 146 9: 105 10: 142 11: 51 12: 49 13: 50 14: 215 15: 178
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Sun Feb 7 12:30:48 1999: SonarData collected
Time: 918415848.531778 X: 24.3 Y: 0
0: 223 1: 108 2: 72 3: 51 4: 50 5: 54 6: 185 7: 159 8: 146 9: 109 10: 146 11: 50 12: 49 13: 50 14: 201 15: 189

Sun Feb 7 12:30:49 1999: SonarData collected
Time: 918415849.161774 X: 25.8 Y: 0
0: 226 1: 106 2: 71 3: 51 4: 50 5: 53 6: 194 7: 149 8: 148 9: 111 10: 146 11: 50 12: 49 13: 50 14: 206 15: 204

Sun Feb 7 12:30:49 1999: SonarData collected
Time: 918415849.581782 X: 26.8 Y: 0
0: 226 1: 113 2: 71 3: 51 4: 50 5: 53 6: 194 7: 149 8: 148 9: 111 10: 149 11: 50 12: 48 13: 50 14: 213 15: 186

Sun Feb 7 12:30:50 1999: SonarData collected
Time: 918415850.51788


X: 28 Y: 0
0: 226 1: 113 2: 69 3: 51 4: 50 5: 57 6: 187 7: 221 8: 149 9: 112 10: 150 11: 50 12: 48 13: 50 14: 213 15: 186

Sun Feb 7 12:30:50 1999: SonarData collected
Time: 918415850.751786 X: 29.6 Y: 0
0: 210 1: 133 2: 68 3: 51 4: 50 5: 56 6: 211 7: 141 8: 150 9: 114 10: 150 11: 50 12: 48 13: 49 14: 209 15: 182

Sun Feb 7 12:30:51 1999: SonarData collected
Time: 918415851.171789 X: 30.8 Y: 0
0: 229 1: 138 2: 67 3: 51 4: 50 5: 56 6: 211 7: 141 8: 150 9: 114 10: 117 11: 50 12: 48 13: 49 14: 204 15: 183

Sun Feb 7 12:30:51 1999: SonarImage created
Time: 918415851.172788 X: 27.55 Y: 0
0: 223 1: 118 2: 69 3: 51 4: 50 5: 54 6: 197 7: 160 8: 148 9: 111 10: 143 11: 50 12: 48 13: 49 14: 207 15: 188

Sun Feb 7 12:30:51 1999: Feature Type: corridor Start: 4.9 End: 27.55
Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way


Time: 918415848.122776 X: 20.2167 Y: 0
0: 223 1: 110 2: 74 3: 51 4: 50 5: 51 6: 165 7: 156 8: 146 9: 105 10: 142 11: 51 12: 49 13: 50 14: 215 15: 178
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415851.172788 X: 27.55 Y: 0
0: 223 1: 118 2: 69 3: 51 4: 50 5: 54 6: 197 7: 160 8: 148 9: 111 10: 143 11: 50 12: 48 13: 49 14: 207 15: 188
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Sun Feb 7 12:30:53 1999: SonarData collected
Time: 918415853.231782 X: 35.3 Y: 0
0: 220 1: 135 2: 66 3: 51 4: 50 5: 52 6: 192 7: 154 8: 156 9: 119 10: 163 11: 50 12: 48 13: 49 14: 199 15: 179

Sun Feb 7 12:30:53 1999: SonarData collected
Time: 918415853.631797 X: 36.9 Y: 0
0: 223 1: 133 2: 64 3: 51 4: 50 5: 52 6: 185 7: 130 8: 156 9: 119 10: 163 11: 50 12: 48 13: 49 14: 201 15: 176

Sun Feb 7 12:30:54 1999: SonarData collected
Time: 918415854.41784 X: 37.8 Y: 0
0: 220 1: 133 2: 64 3: 51 4: 50 5: 52 6: 185 7: 130 8: 156 9: 120 10: 135 11: 53 12: 48 13: 49 14: 189 15: 174

Sun Feb 7 12:30:54 1999: SonarData collected
Time: 918415854.661772 X: 38.8 Y: 0
0: 220 1: 131 2: 74 3: 51 4: 50 5: 51 6: 215 7: 154 8: 160 9: 122 10: 135 11: 53 12: 48 13: 49 14: 189 15: 174

Sun Feb 7 12:30:55 1999: SonarData collected


Time: 918415855.101782 X: 40.4 Y: 0
0: 223 1: 130 2: 74 3: 51 4: 50 5: 52 6: 185 7: 195 8: 160 9: 122 10: 133 11: 111 12: 48 13: 49 14: 190 15: 154

Sun Feb 7 12:30:55 1999: SonarData collected
Time: 918415855.521793 X: 41.5 Y: 0
0: 223 1: 130 2: 74 3: 51 4: 50 5: 52 6: 185 7: 195 8: 162 9: 124 10: 175 11: 50 12: 48 13: 50 14: 224 15: 171

Sun Feb 7 12:30:55 1999: SonarImage created
Time: 918415855.522787 X: 38.45 Y: 0
0: 221 1: 132 2: 69 3: 51 4: 50 5: 51 6: 191 7: 159 8: 158 9: 121 10: 150 11: 61 12: 48 13: 49 14: 198 15: 171

Sun Feb 7 12:30:55 1999: Feature Type: corridor Start: 4.9 End: 38.45
Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415848.122776 X: 20.2167 Y: 0
0: 223 1: 110 2: 74 3: 51 4: 50 5: 51 6: 165 7: 156 8: 146 9: 105 10: 142 11: 51 12: 49 13: 50 14: 215 15: 178
Best Feature Matches:


100 corridor
75 alcove
66.6667 four way

Time: 918415851.172788 X: 27.55 Y: 0
0: 223 1: 118 2: 69 3: 51 4: 50 5: 54 6: 197 7: 160 8: 148 9: 111 10: 143 11: 50 12: 48 13: 49 14: 207 15: 188
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415855.522787 X: 38.45 Y: 0
0: 221 1: 132 2: 69 3: 51 4: 50 5: 51 6: 191 7: 159 8: 158 9: 121 10: 150 11: 61 12: 48 13: 49 14: 198 15: 171
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Sun Feb 7 12:30:56 1999: SonarData collected
Time: 918415856.171779 X: 43.1 Y: 0
0: 254 1: 128 2: 74 3: 51 4: 50 5: 54 6: 196 7: 162 8: 163 9: 125 10: 128 11: 49 12: 48 13: 50 14: 202 15: 171

Sun Feb 7 12:30:56 1999: SonarData collected
Time: 918415856.621772 X: 44.1 Y: 0
0: 230 1: 126 2: 73 3: 51 4: 50 5: 59 6: 205 7: 162 8: 163 9: 125 10: 128 11: 49 12: 48 13: 50 14: 202 15: 169

Sun Feb 7 12:30:57 1999: SonarData collected
Time: 918415857.61773 X: 45.3 Y: 0
0: 230 1: 126 2: 73 3: 51 4: 50 5: 59 6: 205 7: 190 8: 165 9: 126 10: 184 11: 49 12: 13: 49 14: 228 15: 166

Sun Feb 7 12:30:57 1999: SonarData collected
Time: 918415857.691792 X: 46.9 Y: 0


0: 250 1: 124 2: 72 3: 51 4: 50 5: 56 6: 193 7: 148 8: 166 9: 128 10: 165 11: 49 12: 48 13: 49 14: 228 15: 166

Sun Feb 7 12:30:58 1999: SonarData collected
Time: 918415858.121797 X: 47.9 Y: 0
0: 222 1: 123 2: 75 3: 51 4: 50 5: 58 6: 73 7: 156 8: 166 9: 128 10: 165 11: 49 12: 48 13: 49 14: 229 15: 165

Sun Feb 7 12:30:58 1999: SonarData collected
Time: 918415858.531793 X: 48.9 Y: 0
0: 222 1: 123 2: 75 3: 51 4: 50 5: 58 6: 73 7: 156 8: 164 9: 128 10: 75 11: 49 12: 48 13: 49 14: 200 15: 151

Sun Feb 7 12:30:58 1999: SonarImage created
Time: 918415858.532783 X: 46.0333 Y: 0
0: 234 1: 125 2: 73 3: 51 4: 50 5: 57 6: 157 7: 162 8: 164 9: 126 10: 140 11: 49 12: 48 13: 49 14: 214 15: 164

Sun Feb 7 12:30:58 1999: Feature Type: corridor Start: 4.9 End: 46.0333
Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415848.122776


X: 20.2167 Y: 0
0: 223 1: 110 2: 74 3: 51 4: 50 5: 51 6: 165 7: 156 8: 146 9: 105 10: 142 11: 51 12: 49 13: 50 14: 215 15: 178
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415851.172788 X: 27.55 Y: 0
0: 223 1: 118 2: 69 3: 51 4: 50 5: 54 6: 197 7: 160 8: 148 9: 111 10: 143 11: 50 12: 48 13: 49 14: 207 15: 188
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415855.522787 X: 38.45 Y: 0
0: 221 1: 132 2: 69 3: 51 4: 50 5: 51 6: 191 7: 159 8: 158 9: 121 10: 150 11: 61 12: 48 13: 49 14: 198 15: 171
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415858.532783 X: 46.0333 Y: 0
0: 234 1: 125 2: 73 3: 51 4: 50 5: 57 6: 157 7: 162 8: 164 9: 126 10: 140 11: 49 12: 48 13: 49 14: 214 15: 164
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Sun Feb 7 12:30:59 1999: SonarData collected
Time: 918415859.151772 X: 50.4 Y: 0
0: 246 1: 121 2: 76 3: 51 4: 50 5: 57 6: 187 7: 136 8: 170 9: 116 10: 135 11: 49 12: 48 13: 49 14: 200 15: 151

Sun Feb 7 12:30:59 1999: SonarData collected
Time: 918415859.551775


X: 51.4 Y: 0
0: 211 1: 119 2: 83 3: 51 4: 50 5: 52 6: 189 7: 136 8: 170 9: 116 10: 135 11: 49 12: 48 13: 49 14: 75 15: 161

Sun Feb 7 12:30:59 1999: SonarData collected
Time: 918415859.971787 X: 52.4 Y: 0
0: 211 1: 119 2: 83 3: 51 4: 50 5: 52 6: 189 7: 197 8: 172 9: 134 10: 217 11: 50 12: 48 13: 49 14: 220 15: 161

Sun Feb 7 12:31:00 1999: SonarData collected
Time: 918415860.581792 X: 53.9 Y: 0
0: 219 1: 118 2: 72 3: 51 4: 50 5: 52 6: 174 7: 148 8: 174 9: 129 10: 163 11: 50 12: 48 13: 49 14: 220 15: 160

Sun Feb 7 12:31:00 1999: SonarData collected
Time: 918415860.991798 X: 54.9 Y: 0
0: 240 1: 116 2: 74 3: 51 4: 50 5: 52 6: 174 7: 148 8: 174 9: 129 10: 163 11: 50 12: 48 13: 49 14: 195 15: 158

Sun Feb 7 12:31:01 1999: SonarData collected
Time: 918415861.391785 X: 55.9 Y: 0
0: 240 1: 116 2: 74 3: 51 4: 50 5: 52 6: 190 7: 151 8: 159 9: 138 10: 150 11: 50 12: 48 13: 49 14: 195 15: 158

Sun Feb 7 12:31:01 1999: SonarImage created
Time: 918415861.392786 X: 53.15 Y: 0
0: 227 1: 118 2: 77 3: 51 4: 50 5: 52 6: 183 7: 152 8: 169 9: 127 10: 160 11: 49 12: 48 13: 49 14: 184 15: 158

Sun Feb 7 12:31:01 1999: Feature Type: corridor Start: 4.9 End: 53.15
Sonar Images:
Time: 918415842.62862 X: 4.9 Y: 0
0: 224 1: 122 2: 86 3: 51 4: 49 5: 74 6: 146 7: 145 8: 165 9: 97 10: 94 11: 50 12: 49 13: 50 14: 204 15: 169
Best Feature Matches:
100 corridor


75 alcove
66.6667 four way

Time: 918415845.92774 X: 12.4667 Y: 0
0: 231 1: 110 2: 80 3: 51 4: 49 5: 52 6: 132 7: 137 8: 136 9: 99 10: 124 11: 50 12: 49 13: 50 14: 214 15: 177
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415848.122776 X: 20.2167 Y: 0
0: 223 1: 110 2: 74 3: 51 4: 50 5: 51 6: 165 7: 156 8: 146 9: 105 10: 142 11: 51 12: 49 13: 50 14: 215 15: 178
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415851.172788 X: 27.55 Y: 0
0: 223 1: 118 2: 69 3: 51 4: 50 5: 54 6: 197 7: 160 8: 148 9: 111 10: 143 11: 50 12: 48 13: 49 14: 207 15: 188
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415855.522787 X: 38.45 Y: 0
0: 221 1: 132 2: 69 3: 51 4: 50 5: 51 6: 191 7: 159 8: 158 9: 121 10: 150 11: 61 12: 48 13: 49 14: 198 15: 171
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Time: 918415858.532783 X: 46.0333 Y: 0
0: 234 1: 125 2: 73 3: 51 4: 50 5: 57 6: 157 7: 162 8: 164 9: 126 10: 140 11: 49 12: 48 13: 49 14: 214 15: 164
Best Feature Matches:


100 corridor
75 alcove
66.6667 four way

Time: 918415861.392786 X: 53.15 Y: 0
0: 227 1: 118 2: 77 3: 51 4: 50 5: 52 6: 183 7: 152 8: 169 9: 127 10: 160 11: 49 12: 48 13: 49 14: 184 15: 158
Best Feature Matches:
100 corridor
75 alcove
66.6667 four way

Sun Feb 7 12:31:02 1999: SonarData collected
Time: 918415862.11781 X: 57.4 Y: 0
0: 223 1: 114 2: 73 3: 51 4: 50 5: 52 6: 179 7: 156 8: 178 9: 132 10: 150 11: 50 12: 48 13: 49 14: 223 15: 157

Sun Feb 7 12:31:02 1999: SonarData collected
Time: 918415862.421869 X: 58.5 Y: 0
0: 225 1: 112 2: 73 3: 51 4: 50 5: 52 6: 179 7: 156 8: 178 9: 132 10: 214 11: 49 12: 48 13: 49 14: 214 15: 144

Sun Feb 7 12:31:02 1999: SonarData collected
Time: 918415862.821786 X: 59.4 Y: 0
0: 225 1: 112 2: 73 3: 51 4: 50 5: 52 6: 155 7: 128 8: 180 9: 135 10: 147 11: 49 12: 48 13: 49 14: 214 15: 144

End log: Sun Feb 7 12:31:03 1999
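The collection log also shows how each SonarImage relates to the SonarData scans that precede it: the image values are consistent with an element-wise integer average of the previous six scans (for example, sonar 0 at X: 4.9 is (238 + 238 + 210 + 222 + 219 + 219) / 6 = 224, and the X position 4.9 is the mean of 1.7 through 8.4). The following sketch reproduces that filtering step, assuming a truncating integer mean is the filter used; the thesis's own filter code is not listed in this appendix.

// Sketch of the SonarImage filtering step implied by the log above: each
// SonarImage appears to be the element-wise integer average of the
// preceding six SonarData scans (averaging is an assumption here).
#include <iostream>
#include <vector>
using namespace std;

vector<long> makeSonarImage(const vector<vector<long>> &scans)
{
    vector<long> image(16, 0);
    for (size_t s = 0; s < image.size(); ++s) {
        long sum = 0;
        for (const auto &scan : scans)
            sum += scan[s];
        image[s] = sum / static_cast<long>(scans.size()); // truncating mean
    }
    return image;
}

int main()
{
    // The six SonarData scans collected between X: 1.7 and X: 8.4.
    vector<vector<long>> scans = {
        {238,122,89,51,49,54,93,156,126,122,90,51,49,50,202,154},
        {238,122,89,51,49,54,93,174,144,89,91,50,49,50,191,153},
        {210,124,87,51,49,53,184,147,168,93,93,50,49,50,191,153},
        {222,123,86,51,49,114,159,132,168,93,93,50,49,50,228,150},
        {219,121,86,51,49,114,159,132,191,95,128,50,49,50,217,203},
        {219,121,84,51,49,56,193,131,196,95,73,50,49,50,198,206},
    };

    vector<long> image = makeSonarImage(scans);
    for (size_t s = 0; s < image.size(); ++s)
        cout << s << ": " << image[s] << " ";   // sonar 0 prints 224
    cout << "\n";
}

Running this on the six scans from the start of the log reproduces the first SonarImage exactly, including the truncation (e.g., sonar 15: 1019 / 6 = 169).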


Debug Log

Start log: Sun Feb 7 12:30:37 1999

Sun Feb 7 12:30:37 1999: Reading in configuration data
Trans Speed is: 25
Travel Distance is: 600
Sonar Firing Rate is: 25
Corridor Aspect Ratio is: 3
Readings per SonarImage is: 6
Corners to Match are: 6
Min Range to Obstacles is: 18
Short Threshold is: 54
Long Threshold is: 78
Reading Distance is: 5

Sun Feb 7 12:30:37 1999: Initializing patterns
Sun Feb 7 12:30:37 1999: Initializing robot
Sun Feb 7 12:30:37 1999: Entering explore
Sun Feb 7 12:31:03 1999: Shutting Down Robot

End log: Sun Feb 7 12:31:03 1999


Configuration File

robot.cfg

#translational speed for robot (0.1 inch/sec)
25
#distance for robot to travel (0.1 inches)
600
#sonar firing rate
25
#corridor aspect ratio (ratio of length to width of sensor readings)
3
#readings (SonarData) per SonarImage
6
#number of readings that have to be true for FWCornerRule to be true
6
#minimum range (in inches) to an obstacle before robot tries to avoid
18
#short threshold (in inches) for rules
54
#long threshold (in inches) for rules
78
#distance between sonar readings in 0.1 inches
5
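robot.cfg alternates a # comment line with the value it documents, and the debug log above shows the values being read in this fixed order. A minimal reader for this layout might look like the following sketch; the thesis's actual configuration-loading code does not appear in this appendix, so this is an assumed equivalent rather than the original.

// Minimal sketch of a reader for the robot.cfg layout: skip # comment
// lines and blank lines, and collect the numeric values in file order.
#include <fstream>
#include <iostream>
#include <string>
#include <vector>
using namespace std;

int main()
{
    ifstream cfg("robot.cfg");
    string line;
    vector<long> values;

    while (getline(cfg, line)) {
        if (line.empty() || line[0] == '#')
            continue;                  // skip comments and blank lines
        values.push_back(stol(line)); // value line follows its comment
    }

    // Values arrive in the fixed order shown above: translational speed,
    // travel distance, sonar firing rate, corridor aspect ratio, readings
    // per SonarImage, corner-rule count, minimum obstacle range, short
    // threshold, long threshold, and reading distance.
    for (long v : values)
        cout << v << "\n";
}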





