University of Pennsylvania

Moore School of Electrical Engineering

EE 441 / 442 Senior Design

Coordinated Chemical Sensing System

using Sony AIBO

Raymond Ciarcia, Brian Corwin, Paul Vernaza

April 29, 2004

Advisor: Dr. Daniel Lee

Email: ddlee@seas.upenn.edu

Coordinated Chemical Sensing System using Sony AIBO

University of Pennsylvania, Moore School of Electrical Engineering

Raymond Ciarcia, Brian Corwin, Paul Vernaza

The purpose of the coordinated chemical sensing system is to allow a user to scan a room for the presence of a dangerous chemical, without having to physically be in the room, preventing human exposure to a possibly dangerous environment. The best way to give the user this capability is to use a group of semi-autonomous mobile robots. Even though this is not their typical role, the Sony AIBO robotic dogs are a good choice given the substantial base software and sensory hardware they provide.

Each Sony AIBO is equipped with chemical sensors to search an enclosed room (no larger than 2000 square feet) containing limited obstacles for potential source(s) of an airborne chemical (such as a gas leak or chemical spill) to within an accuracy of ±1 foot. Several dogs scan the room in a coordinated search pattern, overseen by a user at a nearby PC, sending back chemical signal values that are used to create a map of chemical signal strength throughout the room. Localization within the room is provided by two main systems (auditory and visual localization) to within an accuracy of ±1 foot. Each AIBO is able to broadly scan at a rate of 12 square feet per minute. The AIBOs investigate strong signals with a tighter search in order to more accurately determine the epicenter of a source to within ±1 foot.

Table of Contents

1 MOTIVATION / GOALS

2 INTRODUCTION

2.1 CHEMICAL SENSING

2.2 LOCALIZATION

2.3 INTEGRATION

3 DISCUSSION OF PREVIOUS WORK

3.1 “STEREO-OLFACTION”

3.2 “GAS DISTRIBUTION IN UNVENTILATED INDOOR ENVIRONMENTS”

3.3 “ROBOCUP”

4 STRATEGIC PLAN

4.1 SYSTEM PROPOSED / THEORY OF OPERATION

4.2 SYSTEM SPECIFICATIONS

4.3 HARDWARE AND SOFTWARE REQUIREMENTS

4.4 TEST AND DEMONSTRATION

4.5 SCHEDULE

5 RESULTS

5.1 SENSOR INVESTIGATION

5.2 VISUAL LOCALIZATION

6 LESSONS LEARNED

7 EQUIPMENT / FABRICATION NEEDS

8 CONCLUSIONS AND RECOMMENDATIONS

9 NOMENCLATURE

10 REFERENCES

11 BIBLIOGRAPHY

12 FINANCIAL INFORMATION

13 ETHICS

14 SOFTWARE DOCUMENTATION

1 MOTIVATION / GOALS

Recently, the United States has had to face the reality that terrorist attacks on its soil are a real possibility requiring constant vigilance and preparedness. One such eventuality that must be considered is the possibility of an attack using chemical or biological weapons. It is frightening but not impossible to imagine, for example, that a terrorist might place a dangerous chemical gas source in the center of a crowded building or subway station, particularly near a ventilation mechanism that could diffuse the chemical into the air over a large radius.

These scenarios, in addition to accidental chemical spills or leaks that can propel toxins into the air, demonstrate the need for a method by which the presence of a chemical can be rapidly detected and the concentration mapped to help locate its source. Since these chemicals are hazardous to humans, this task is best suited for machines.

The goal of the Coordinated Chemical Sensing System (CCSS) is to offer a means by which a remote user, safely outside the chemical’s reach, can easily command a team of robotic dogs, each equipped with a chemical sensor, to conduct the search. To accomplish this task, the system will be composed of two main elements, the Sony AIBOs and a graphical user interface (GUI) located on the operator’s personal computer.

The Sony AIBO dogs will:

• Navigate the room to map the chemical’s distribution with as little human guidance as possible

• Send information (visual, chemical, location) to a remote user’s PC

The Graphical User Interface will:

• Display an estimated location of each dog

• Display chemical concentration sensed by each dog

• Allow operator to easily, directly control any dog

2 INTRODUCTION

The advantage of using robots as opposed to humans to find a chemical is clear, and the idea is one that has been receiving the attention of research groups. As one research team put it, “The detection and localization of drugs, explosives or gas leaks in hostile environments or in public places is currently a very active area of research.” [1] However, before such an ability is realized, there are many obstacles that must be overcome.

2.1 CHEMICAL SENSING

Many of these obstacles center on the chemical detection process itself. One problem, for example, is that readings from chemical sensors are not instantaneous: the sensors require long sampling times and return time-integrated results rather than absolute readings [2]. Even if a highly accurate sensor could be found, the slightest turbulence in a region where the chemical has dispersed can create a confusing environment for a localization system, where peaks in chemical concentration may not necessarily imply proximity to the source. [2]

2.2 LOCALIZATION

Robot localization, in other words knowing where a robot is located after it has been moving for a while, is also very problematic. Currently, robots that need accurate location information rely on a carefully predefined environment. For example, the Sony AIBO dogs used in the Robocup soccer competition rely on the ability of the dog to see very specific lines and colored markers on and around the tournament soccer field. [3] Once these visual cues or other specific parameters are removed, localization of a robot becomes much more difficult.

2.3 INTEGRATION

Because a great deal of time and effort, and far more resources than are available here, have already gone into investigating these problems individually, the aim of this project is not to provide new and improved sensors or localization tools. Instead, the goal is to integrate currently available sensor and localization technologies, along with smart teamwork among the robots, to create a practical tool for aiding in the discovery of a chemical leak or spill. While the focus remains on autonomy of the robot team, to address the limitations of current technology some control and decision-making will inevitably be left to a human user. In this manner, algorithmic solutions, in conjunction with human input, can bridge the gap between existing research projects and a system that can be used in a more practical setting.

3 DISCUSSION OF PREVIOUS WORK

3.1 “STEREO-OLFACTION”

This study by Olivier Rochel et al. [1] aimed to design a robot that would “be able to go toward an odor source” [1]. As they saw it, there were two key tasks involved: “1) estimation of the gas-concentration gradient and 2) gas recognition independent of concentration.” [1] To address the first task, they proposed using “stereo-olfaction,” noting that in animals “a small difference of 10 percent in concentrations between the two olfactory sensors…is enough to estimate the gas concentration gradient.” [1] Thus, their robot contained two banks of sensors, one on each side of the vehicle. The second task was accomplished by using different types of chemical sensors and comparing the measurements from each. To minimize the effect of turbulence, the group also found it necessary to “sample the surrounding atmosphere” [1] via an inlet pipe that led to the otherwise encased sensors. At the time of the paper, the ability of the robot to localize the gas source was still being tested.

3.2 “GAS DISTRIBUTION IN UNVENTILATED INDOOR ENVIRONMENTS”

Michael Wandel et al. [2] focused on the problem of odor source localization in an unventilated indoor environment, where “propagation of gases… is dominated by turbulence.” [2] Rather than creating an artificial air current, the group hoped to find an algorithm that could address the difficulties presented by such turbulence. In two similar tests, a single robot drove in a predefined rectangular spiral around the chemical epicenter located in the middle of a room, creating a map of chemical concentration around the source. Strikingly, the group concluded that “the biggest concentration was never measured close to the source.” [2] Instead, the highest concentration was found near the walls. The paper attributes this to the fact that windows in the wall were heated by sunlight and thus created air circulation near the walls. Given that results from both tests were very similar, the group believed it might be possible to develop a localization algorithm, but no such algorithm had been written at the time of the paper.

The study also noted a difficulty with the chemical sensor in that it didn’t provide an absolute measurement of concentration but rather, due to long sensor recovery times, it introduced, “considerable temporal integration of successive readings.” [2]

Tracking of the location of the robot along the predefined path was accomplished in one test by using a laser to measure the distance to the closest wall, which assumed prior knowledge of the room dimensions. The second test used overhead cameras to track the vehicle. In both tests, the paths were created with full knowledge of the room dimensions and the location of the chemical source.

3.3 “ROBOCUP”

Much work has been done by David Cohen et al. [3] regarding the Sony AIBO robot. One area of particular interest is the group’s work to embed a Perl interpreter in the dogs, which allows for “rapid development” [3] using high-level Perl commands rather than C++. The group also created an interface by which a Matlab program running on a remote computer can “send and receive matrices over the wireless network” [3] to and from the dogs, allowing for easy exchange of large amounts of data.

To form a probabilistic estimate of the location of each dog, the group merged an estimate of distance traveled according to the dog’s odometry system with visual cues provided by the soccer field. Obstacle avoidance was accomplished by programming the dog to generate a potential field based upon its surroundings; the field then guides the dog toward free paths and away from undesirable ones. As each dog had a choice of one of three possible behaviors (Attack, Defend, Support), simple teamwork consisted of communication among the dogs to prevent too many of them from entering the same behavior mode.

4 STRATEGIC PLAN

The project can be divided into two basic units. The first unit is the Sony ERS-210A AIBO robotic dog. This unit consists of a standard AIBO (which includes a camera and an infrared proximity detector) plus the components with which it will be outfitted, including sensors and custom software loaded onto the AIBO. The second unit is the program residing on the operator’s PC that allows for high-level coordination of the dogs, provides the user with a graphical representation of the data collected by the dogs, and gives the user control over the team.

4.1 SYSTEM PROPOSED / THEORY OF OPERATION

As shown in Figure 1, the system includes multiple Sony AIBO robotic dogs, equipped with sensors, a sensor-to-serial bridge, and a visual and auditory localization scheme, that can communicate with one another and an operator using a GUI control program on a PC. To get sensor information to the AIBO, a sensor-to-serial bridge is needed to convert the analog output of the sensor into digital format that can be transferred to the AIBO via its serial communications interface. Information from the sensor-to-serial bridge, as well as camera and infrared range finder data, are transmitted to the operator’s PC via an 802.11b wireless Ethernet. This information is displayed in the operator’s program in an integrated fashion. Software in the individual AIBOs and on the operator’s PC coordinates the actions of the AIBO team.

[pic]

Figure 1 – Coordinated Chemical Sensing System high-level overview

4.1.1 Sensors / Search Design

Tests of the TGS 2620 ethanol sensor indicate that reasonably accurate detection of an alcohol source is possible (see Section 5.1.1). The tests indicate that the best way to detect an alcohol source is to let the AIBO sit at a position for a specified amount of time (the detection period) and record the average and peak signal voltages from the TGS 2620 sensor over that period. The peak is the more important value: if it is above a certain threshold, a source is likely nearby. A high average as well gives more assurance that a source is nearby (see Section 5.1.1). Because of this, both the average and the peak will be used to find the source.

The tests show that alcohol will be detected if it is present (i.e., no false negatives), but the sensor may not give an accurate position for the source (i.e., false positives can occur). This means that the search will require some level of redundancy, such as one AIBO double-checking areas covered by another. Other methods, such as an isolation box and suction mechanism for sampling the air, may also be used, similar to the technique of a previous study [1], to allow greater environmental control during sampling.
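As a sketch of how this detection-period decision rule might look in software (the thresholds are those measured in Section 5.1.1; the function and type names are illustrative, not part of the actual Planner Program):

#define PEAK_THRESHOLD 1.3 /* volts; peaks of 1.3-1.5 V suggest a nearby source (Section 5.1.1) */
#define AVG_THRESHOLD  1.0 /* volts; averages of 1.0-1.1 V add confidence */

typedef enum { NO_SOURCE, POSSIBLE_SOURCE, LIKELY_SOURCE } detection_t;

/* Classify one detection period from its peak and average sensor voltage. */
detection_t classify_period(double peak_v, double avg_v)
{
    if (peak_v < PEAK_THRESHOLD)
        return NO_SOURCE;       /* tests showed no false negatives */
    if (avg_v >= AVG_THRESHOLD)
        return LIKELY_SOURCE;   /* high peak and high average */
    return POSSIBLE_SOURCE;     /* high peak only: have another dog re-check */
}

A POSSIBLE_SOURCE result is exactly the false-positive case that motivates the redundant, overlapping search described above.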

4.1.2 Localization

The ultimate goal is for the dogs to take a room and figure out for themselves how to divide up the mapping task and perform it without any human intervention. Key to this goal is robot localization.

4.1.2.1 Visual Localization

The next best thing to setting up visual cues in a room to help robots localize themselves is to have the robots bring the markers with them. Each robot dog will be equipped with a uniquely colored ball mounted above its back by which other dogs can visually recognize it. Although all dogs carry such a marker, two dogs are designated “anchor dogs,” while all others are designated “search dogs.” The anchor dogs enter the room first and move parallel to the wall containing the doorway through which they entered until each is some distance d/2 from the point of entry. Audio localization, described in section 4.1.2.2, can be used to verify d, the distance between the dogs.

Once these anchor dogs are in place, search dogs can enter the room. To determine the location of a search dog, an anchor dog will center the other anchor dog in its field of vision and then center the search dog in its field of vision. By recording the position of its head in each case, it now knows the angle between the other anchor and the search dog. The other anchor dog does the same, providing a second angle. Since the distance between the two anchors is already known as d, two angles and a side of a triangle are known and provide sufficient information to localize the search dog. Figure 2 illustrates this scheme.

[pic]

Figure 2 - Visual localization

Based on a coordinate system with its origin halfway between the two anchor dogs and its x-axis along the line joining them, with θ and Φ the angles each anchor measures between the other anchor and the search dog, the location of the search dog is given as:

(x, y) = ( d(tan Φ – tan θ) / [2(tan θ + tan Φ)] , d tan θ tan Φ / (tan θ + tan Φ) )

For verification, a search dog can also, when necessary, turn to face the anchor dogs and determine the angle between the two anchors as seen from its position to provide an additional estimate of its location.
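The computation is small enough to sketch directly. The following C function (illustrative names; angles in radians) implements the triangulation above; for d = 10 and θ = Φ = 45º it returns (0, 5), matching the worked case in Section 5.2:

#include <math.h>

/* Anchors sit at (-d/2, 0) and (d/2, 0); theta and phi are the angles
 * each anchor measures between the other anchor and the search dog. */
void localize(double d, double theta, double phi, double *x, double *y)
{
    double t1 = tan(theta), t2 = tan(phi);
    *x = d * (t2 - t1) / (2.0 * (t1 + t2));
    *y = d * t1 * t2 / (t1 + t2);
}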

4.1.2.2 Auditory Localization

A second method of localization that can complement the first is the use of auditory signals. Dr. Lee has developed a system by which a dog can emit a typical phone touch-tone followed by a “locking” signal. A second dog in hearing range can identify the touch-tone emitted and determine the time the tone was received by analyzing the locking signal. This capability can be used to determine how far apart two dogs are. One dog initiates by sending a sound signal and recording the time at which the signal was sent, called T1A. The second dog records the time the sound was heard as T2A. If the clocks of the dogs were synchronized, the distance between the dogs could be computed simply as (T2A – T1A) × speed of sound. However, synchronization cannot be assumed, and so a second step is needed. The second dog responds by generating its own tone at T2B, which is received by the first dog at T1B. For an illustration, see Figure 3.

[pic]

Figure 3 - Auditory localization

Thus, the distance can be calculated independent of clock differences as:

d = [(T2A – T1A) – (T2B – T1B)] × (speed of sound) / 2

Although this does not provide an angle, the distance measurement can be used in conjunction with visual localization to verify the location of a dog.
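A sketch of the two-way ranging computation (timestamps in seconds on each dog's local clock; the constant and the names are illustrative):

#define SPEED_OF_SOUND 343.0 /* m/s at room temperature */

/* The unknown clock offset appears with opposite signs in (t2a - t1a)
 * and (t1b - t2b), so their sum is twice the acoustic time of flight. */
double range(double t1a, double t2a, double t2b, double t1b)
{
    return SPEED_OF_SOUND * ((t2a - t1a) + (t1b - t2b)) / 2.0;
}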

4.1.3 Coordination

A final design consideration is the coordination of the AIBO dogs. Though particular search patterns will be designed and optimized in response to continued sensor and localization findings, a few basic tenets can be laid out. First of all, because of the risk of false positives in detecting chemicals, any search pattern must allow more than one dog to work together to investigate a potential chemical source. Furthermore, because of the range limitations imposed by the use of visual localization, special coordination will be necessary to extend the area over which a search can be conducted. For example, a search dog that is out of visual range of the anchor dogs should be able to request a localization estimate from two other search dogs that do know their locations relative to the anchor dogs. Similarly, when the area bounded by the visual range limits has been searched but the room extends further, the anchor dogs can move themselves to the end of the searched zone and begin the process anew. While such procedures will introduce considerably more error, in cases where no other localization is possible, a rough estimate may prove more useful than none.

4.2 SYSTEM SPECIFICATIONS

The completed system should provide:

• Linearly scalable mapping rate with respect to the number of robots, with a minimum of three robots.

• Peak and average chemical sensor output values for each detection period, reported to the user when they exceed the given alcohol detection threshold (see Section 4.4.1).

• Mapping resolution of 1 reading per 4 square feet (2’x2’ square mapping sections used).

• Mapping rate of three sections per minute per dog (12 square feet per minute).

• Dog position localization to an accuracy of 1 foot.

• Information display updated in real time for each new reading (within 0.5 seconds of reading completion).

• Re-coordination of search within 15 seconds if an AIBO shuts down.

• Manual override that stops team and automatic recovery from manual override.

4.3 HARDWARE AND SOFTWARE REQUIREMENTS

In addition to the AIBOs and the operator’s PC, this project requires sensor hardware and hardware to interface the sensors with the AIBO dogs. Software is needed both in the dog, to coordinate the flow of information from the sensors to the PC and direct the dog’s actions, and in the PC, to collect and display all relevant data and control teamwork among the dogs.

4.3.1 Hardware Requirements and Design Approach

Figure 4 shows all major hardware components that must be designed.

[pic]

Figure 4 - Hardware overview

4.3.1.1 Sensor and Power Considerations

The major pieces of hardware involved in the project are the AIBOs and the system attached to each dog’s serial console port to deliver information from the external sensors to the AIBO.

In general, sensors will be connected to the A/D converter on an HC11. The chemical sensor, a TGS 2620 ethanol sensor, has as its basic measuring circuit a voltage divider [4] as seen in Figure 2. The output of the sensor is an analog voltage from 0 to +5 V, fitting the HC11’s A/D converter voltage range when VRH is VDD = 5 V and VRL is VSS = 0 V in standard operation [6]. A low-pass filter will buffer the A/D converter input from the sensor in order to restrict noise. A major consideration is the power supply. The available Adapt11C24DX Evaluation Board (EVB) has a nominal current usage of ~45 mA [5], and the 68HC11E chip itself, with a 2 MHz E-clock, uses ~150 mW of power [6]. The sensor voltage divider has power needs of ~15 mW, but the heater circuit in the sensor has power requirements of ~210 mW and 45±4 mA [4]. This is far more current than the VDD pin of the EVB can source. Thus the sensor circuit cannot be supplied with power from the HC11, but must instead draw its power from the battery. The Adapt11C24DX requires 5 to 12 VDC for its power supply [5], while the sensor circuit requires 5±0.2 VDC [4]. The tight voltage regulation needed by the sensor and the separation of the power supply from the HC11 require dedicated voltage regulator circuits running off the battery.

4.3.1.2 RS-232 Microcontroller to AIBO Communication

Data from the sensors collected by the microcontroller will be sent to the AIBO through the EVB RS-232 interface. A cable will go directly from the EVB RS-232 connector to the ribbon cable connected to the console port on the AIBO. Potentially, a “break-out box” built by Sony for the AIBO may be used. The typical connection between a PC RS-232 port and the Sony AIBO is an RS-232 cable to a “break-out box” with a ribbon cable and connector to the console port on the underside of the AIBO, as seen in Figure 2. This is the simplest solution, but these “break-out boxes” are in limited supply. It may become necessary to solder wires directly to the ribbon cable instead of using the “break-out box”.

Another important factor in the RS-232 communication is the baud rate. The baud rate of the AIBO is fixed at 19200 bps, but the Adapt11C24DX boards have a clock circuit oscillator speed of 8 MHz, which does not support 19200 bps [6]. A clock circuit speed of 9.8304 MHz is needed to achieve this baud rate [5]. Thus, it is necessary either to replace the clock circuits on the existing Adapt11C24DX boards with a 9.8304 MHz clock or to purchase new boards.
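The arithmetic behind the crystal requirement is worth spelling out (a sketch; the exact SCI divider settings depend on how the BAUD register is configured). The HC11 derives its E-clock by dividing the crystal frequency by four, and the SCI derives its baud rate by further dividing the E-clock. A 9.8304 MHz crystal gives E = 9.8304 MHz / 4 = 2.4576 MHz, and 2.4576 MHz / 128 = 19200 Hz exactly. With the stock 8 MHz crystal, E = 2 MHz, and no available SCI divisor of 2 MHz produces 19200.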

4.3.2 Software Requirements and Design Approach

The software required for this project falls into three major categories:

• Sensor Program (located on the microcontroller): control the sampling and communication of the custom sensor data

• Planner Program (located on the dog): acquire sensor data, use it to form and execute a plan of action, and communicate it and other data over wireless Ethernet

• Display Program (located on the PC): receive data from the Planner Program and display it to the end user

The interconnection of these programs is shown in Figure 5.

[pic]

Figure 5 - Software module connectivity

4.3.2.1 Sensor Program

The main requirements for the Sensor Program are as follows:

• The program should periodically trigger the conversion of analog sensor data to digital form.

• Each incoming sensor datum should be processed and formatted as it arrives according to a protocol decipherable by the Planner Program.

• Each sensor datum should be formatted and sent over the serial port before the arrival of the next sensor datum.

• The program should be able to process at least 30 sensor readings per second.

The satisfaction of most of the requirements of the Sensor Program is contingent only on knowledge of the basic operation of the HC11, and hence should present no major difficulty. The protocol should be designed with efficiency in mind, but this should not be an overriding factor, considering that the available communication bandwidth (19200 bps) far exceeds the sensor data production rate (on the order of 30 bytes per second).
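As an illustration, one line-oriented framing and its decoding might look like the following (a sketch only; it accepts both the SENSOR = C form used in the test plan of Section 4.4.1.1 and the SENSOR C form emitted by the Appendix A1 listing):

#include <stdio.h>
#include <string.h>

/* Returns 1 and fills *value if the line carries a sensor reading. */
int parse_sensor_line(const char *line, int *value)
{
    if (strncmp(line, "SENSOR", 6) != 0)
        return 0;
    return sscanf(line + 6, " = %d", value) == 1 ||  /* "SENSOR = 123" */
           sscanf(line + 6, " %d", value) == 1;      /* "SENSOR 123"   */
}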

4.3.2.2 Planner Program

The main requirements for the Planner Program are as follows:

• The program should read formatted data from the serial port and extract from it sensor data.

• The program should combine visual and auditory data to provide an estimate of position accurate to within one foot.

• The program should periodically combine sensor data with other data (e.g., position), format it according to a protocol decipherable by the Display Program, and send it to the Display Program over wireless Ethernet.

• The program should be able to communicate with Planner Programs on other dogs via wireless Ethernet.

• The program should be able to control the motion of the dog based on sensor data and data received from other dogs.

• Motion planning should seamlessly integrate data from additional dogs on the network and make adjustments automatically in the event that dogs stop responding.

• Motion of the dog should be planned such that it does not interfere with other dogs or collide with simple obstacles and such that it maximizes total rate of coverage.

• The program should be able to receive and respond to messages sent over wireless Ethernet from the Display Program.

Localization data from the visual and auditory systems will be collected and analyzed by the Planner Program. Multiple location estimates may be run through a digital filter to provide a more stable and accurate running estimate of position. Additionally, experimentation may be used to calibrate the position estimate in order to achieve the required resolution of one foot.
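One candidate digital filter is a first-order exponential smoother (a sketch; the actual filter and its gain would be chosen experimentally, as noted above):

typedef struct { double x, y; } position_t;

/* Blend each new measurement into the running estimate.
 * alpha in (0,1]: larger values trust the new measurement more. */
void filter_position(position_t *est, position_t meas, double alpha)
{
    est->x += alpha * (meas.x - est->x);
    est->y += alpha * (meas.y - est->y);
}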

Low-level motion control requirements will be satisfied by leveraging existing libraries that abstract the motion control interface to the realm of desired x-y coordinates. A motion planning module will provide target x-y coordinates given current location and location history, the current location of other dogs, and/or manual input from the Display Program.

Communication among Planner Program instances can be accomplished efficiently via UDP broadcasts, functionality provided by the OPEN-R API. Communication between the Planner Program and the Display Program could occur through TCP connections, which the OPEN-R API also provides.
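OPEN-R supplies the actual networking interface on the AIBO; purely to illustrate the one-to-all pattern, a generic POSIX version of a status broadcast might look like:

#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Send msg as a UDP datagram to every host on the local subnet. */
int broadcast_status(const char *msg, int port)
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    int on = 1;
    struct sockaddr_in addr;
    if (fd < 0) return -1;
    setsockopt(fd, SOL_SOCKET, SO_BROADCAST, &on, sizeof on);
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    addr.sin_addr.s_addr = htonl(INADDR_BROADCAST);
    sendto(fd, msg, strlen(msg), 0, (struct sockaddr *)&addr, sizeof addr);
    close(fd);
    return 0;
}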

4.3.2.3 Display Program

The main requirements for the Display Program are as follows:

• The program should receive sensor and localization data from the Planner Program and use it to update the user's display within 0.5 seconds of reception.

• The program should provide a graphical display correlating sensor and localization data.

• The program should provide high-level coordination, allowing the user to disable automatic operation, control a dog manually, and re-enable automatic operation.

Low-level communications requirements will be satisfied by using the Java sockets API. Graphics will be provided using either the Java Abstract Windowing Toolkit or the Swing API.

4.4 TEST AND DEMONSTRATION

4.4.1 Test

4.4.1.1 Sensor Program

A function generator will be attached to the analog signal input of the HC11. A sine wave ranging from 0.0V to 5.0V and with a period of one second will be generated. The HC11's serial port will be connected to the PC, and the signal given for the sensor program to start. The output of the program will then be observed on the terminal window. The output should conform to the format SENSOR = C, where C is a number that follows a sinusoidal pattern with a period of one second, or 30 readings.

After verifying the sensor program using artificial sinusoidal input, the entire sensor program and sensor hardware sub-system will be verified using tests similar to those for the sensors (see Section 5.1).

4.4.1.2 Planner Program

The serial port of the AIBO will be connected to the serial port of the HC11, and the input of the HC11 will be determined as described in "Testing the Sensor Program". The AIBO will be turned on and placed on the ground.

A receiving program will be set up on the PC to connect over the wireless Ethernet to the AIBO in place of the Display Program. This program will provide a real-time display of the text being sent by the AIBO to the Display Program. The output will be checked for consistency with the protocol defined for communication between the Display Program and the Planner Program. The content of the data will be checked as well; the sensor reading should follow the same sinusoidal pattern.

Localization requirements (see Section 4.2) will be tested by moving the AIBO to a sequence of fixed locations in a 10’ by 10’ grid (measured every one foot), recording the returned position data for five trials at each location. The average of these readings will be checked against the precision requirements already defined. This test will be performed for the visual and auditory localization separately and together.

Communication between Planner Programs will be tested by setting up an additional AIBO in the same manner as that already described. A packet-sniffing tool will then be run on the PC in order to ensure data is flowing between Planner Programs.

Motion planning requirements will be tested by ensuring that, given a number of different starting positions, the robots do not impede each other’s movement for more than a set number of seconds in a fixed timeframe.

Fault tolerance requirements will be tested by shutting an AIBO down randomly. Behavior will be checked to ensure that it responds with little apparent change.

4.4.1.3 Display Program

The same testing environment as for the Planner Program will be employed, with the exception that the PC software (packet sniffer, etc.) will be replaced with the actual Display Program. Performance lag requirements will be tested by tracking the robot’s position in real time and recording the timestamps associated with unique sensor measurements, both as sent by the robot and as displayed by the Display Program. The average difference between timestamps for the same measurements should not exceed the bound specified in the requirements (see Section 4.2).

User interface requirements will be tested by ensuring that all user interface controls effect the expected behavior on the robot.

4.4.2 Demonstration

The team will need to be deployed in several scenarios (no source, one source, and multiple chemical sources) to demonstrate the robustness of the detection and search capabilities of the system, and in two different rooms to show the adaptability of the localization and coordination systems. The demonstration will need two small, enclosed rooms with limited obstacles and normal lighting. The most promising room identified is Moore 212. This room is the right size and sufficiently simple (rectangular, limited obstacles once chairs are removed, plain white walls, etc.).

There are specific problems associated with searching for fire in the demonstration that do not come up in searching for alcohol. It is unrealistic and dangerous to set a small fire in a room without a person in it to control the fire if something goes wrong. Instead, it will be necessary to make a model that resembles a fire (i.e., something that gives off the same chemical, visual, and heat signatures as a fire). A light bulb and a space heater may be effective substitutes for the flame’s light and heat. Tests will need to be done to determine whether their characteristics match a fire closely enough. The chemical signature will be more difficult, since a small CO2 source that dispenses CO2 at a limited rate will be needed. The amount of CO2 actually given off by a small fire will also need to be determined.

4.5 SCHEDULE

[pic]

There are three main parts of the project that we were to have accomplished or worked on this semester according to the schedule: localization scheme design, chemical sensor properties determination, and interfacing the Sony AIBO with the Motorola HC11.

We are on schedule with localization: designs for both the visual and the auditory localization schemes are complete. A large amount of software development remains to implement the schemes on the AIBO, but we now have the conceptual basis and the mathematical models for both. The visual localization is particularly far along, as its error analysis is complete.

The sensor investigation is shown as behind schedule, but this does not accurately reflect the state of the work. Initial investigation of the TGS 2620 alcohol sensor uncovered unexpected problems, prompting investigation of alternative targets to search for (e.g., fire) and thus other sensors (e.g., the dog’s camera for finding a flame, the LM34 temperature sensor, the TGS 4160 CO2 sensor) that need to be tested. The camera and the LM34 temperature sensor have undergone testing, although further testing of the LM34 with a large heat source is needed (see Section 5.1.3), and problems obtaining the TGS 4160 prevented earlier testing. The determination of these sensor properties is the top priority and will be completed early next semester.

The interfacing of the HC11 with the sensors to the Sony AIBO is behind schedule in some parts. The HC11 program to obtain sensor data from the HC11 A/D port has been finished, but actual interfacing of the HC11 to the AIBO has not been done. Early on it was discovered that the AIBO’s RS-232 serial port baud rate (19200 bps) was incompatible with any HC11 that the University has. Ordering special new HC11s with the proper clock speed to do serial communications at 19200 bps has taken more time than expected. Barring any unforeseen difficulty, it should take only a couple of weeks after getting the boards to get the AIBO communicating serially with the HC11.

5 RESULTS

5.1 SENSOR INVESTIGATION

This section discusses the results of testing the TGS 2620 alcohol sensor and of preliminary testing of the Sony AIBO’s camera and the LM34 temperature sensor (with regard to detecting fire). These experiments characterized the sensors in order to determine whether it is more practical for the AIBO team to find an open, sitting alcohol source in a room or a small fire. The results indicate that both are possible, but each has obstacles. Since many of the other parts of the system are independent of the search target, both the alcohol and fire detection scenarios will be investigated, with greater emphasis on chemical detection.

5.1.1 TGS 2620 Ethanol Alcohol Sensor

Tests on the sensor show that it can correctly detect the presence of ethanol (i.e., it will not miss it if it is present) to within 2’ to 3’ of the source. The major problem was false positives, where a very strong signal that would indicate close proximity to the source (~1’) is detected from further away (~3’).

The measuring circuit for the sensor is a simple voltage divider formed by RS (the internal resistance of the sensor, which varies inversely with ethanol concentration) and RL (3600 Ω), with an input voltage of 5.0 V (see Figure 3). As the ethanol concentration increases, VL, the sensor output, increases. The major experiment performed was to leave an open source of alcohol in a closed room, let the room atmosphere saturate with alcohol for approximately 20 minutes, then place the sensor along a straight line with the source and move it toward the source (the positive direction in the graph) in set increments, then past the source, and then back. The sensor is left at each distance for 60 seconds, and its peak voltage and average voltage during this period are measured using an HP 54600B oscilloscope.
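For reference, the divider relation also lets the sensor’s internal resistance be recovered from the measured output; this is the standard voltage-divider identity, not code from the project:

#define VC 5.0     /* divider input voltage, volts */
#define RL 3600.0  /* load resistor, ohms */

/* VL = VC * RL / (RS + RL)  =>  RS = RL * (VC - VL) / VL.
 * RS falls as ethanol concentration rises, so VL rises with concentration. */
double sensor_resistance(double vl)
{
    return RL * (VC - vl) / vl;
}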

As can be seen in Figure 6, there is a strong increase in voltage signal (particularly peak voltage) around –1’. The sharp decrease in signal on the positive side of the source can be attributed to a slight draft in the room in the negative direction. The peak voltages are often much higher than the average; this is because, at close proximity to the source, the voltage often exhibits a high peak for a short time, as shown in Figure 7, making the peak high but the average over the interval small.

[pic]

Figure 6 - Ethanol sensor test results

[pic]

Figure 7 - Sensor output from a 60 second measurement interval at the +1’ position

These results indicate that the peak voltage within the time period is the more reliable indicator of alcohol presence (i.e., a high average means there is certainly an alcohol source present, probably within two feet, while a high peak without a high average means there may still be a source within two feet). The data indicate that an average threshold between 1.0 V and 1.1 V and a peak threshold between 1.3 V and 1.5 V signal the presence of an ethanol source within 1.5’.

Specific trials sometimes showed strange anomalies (e.g. a trial with peaks of 1.238V at –3’ and 0.9V at –2’) that could possibly lead to false positives. Essentially the alcohol sensor will know if a source is present and on average find its location well, but occasionally an anomaly could mark the source position ±1’ or more from the source.

5.1.2 Camera

The initial tests done on the Sony AIBO’s camera indicate that it can distinguish a candle-sized flame from its background in normal room lighting from up to seven feet, and can spot the flame from one foot in most lighting and background conditions. A flame produces a large amount of IR that will overexpose the pixels of the Sony AIBO’s charge-coupled device (CCD) camera (according to Dr. Lee). This makes the image of a flame on the AIBO camera very bright white, independent of the actual color of the flame itself. A MATLAB program, FlameObserver, was written to pick out these “bright” pixels from normal ones so that the AIBO can find the flame (see Section 14). The program checks each pixel’s red, green, and blue values to determine whether all are above 94.12 percent of the maximum value possible (maximum red, green, and blue indicates pure white). If the pixel meets these “bright” requirements, it is displayed as white; otherwise it is black.
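The per-pixel test is simple to state; a C rendering of the MATLAB check in Appendix A2 (threshold 240 of 255, i.e., 94.12 percent of full scale):

#define BRIGHT_THRESHOLD 240 /* ~94.12% of the 8-bit maximum of 255 */

/* A pixel is "bright" only if all three channels exceed the threshold. */
int is_bright(unsigned char r, unsigned char g, unsigned char b)
{
    return r > BRIGHT_THRESHOLD && g > BRIGHT_THRESHOLD && b > BRIGHT_THRESHOLD;
}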

Figure 8 shows the FlameObserver output in normal room lighting conditions from one foot and seven feet away from the candle-sized flame. The white spots at the top of the photos are halogen lamps.

|[pic] |[pic] |

Figure 8 - FlameObserver output in regular room lighting from 1’ and 7’.

The AIBO can thus identify the flame. The only major problem is false positives, either from lights or from bright spots or reflections.

As seen in Figure 9, only in intense lighting conditions with a reflective background is the picture totally white; in that case it would be impossible to find the flame by sight. In other cases with bright spots other than the flame, candidate flames would be found, and each would be investigated with other sensors (e.g., the temperature sensor). The RGB characteristics of the flame versus other white and bright spots will also be investigated to determine whether a more sophisticated flame detection algorithm can separate these different “bright” sources.

|[pic] |[pic] |

|Above: background is white wall. |

|[pic] |[pic] |

|Above: Background is reflective white poster board |

Figure 9 - FlameObserver output from 1’ from flame with two different backgrounds in strong (left) and intense (right) light.

5.1.3 Temperature Sensor

Preliminary investigation of the temperature sensor shows that a small candle-sized flame from a butane lighter will produce a detectable temperature change on the LM34 temperature sensor from 1’’. It can reasonably be extrapolated from this that a much larger fire will produce enough heat to create a temperature change noticeable by the LM34 from 4’’ to 6’’ away. This allows the AIBO some flexibility in how close it has to get to a potential flame detected by the camera (see Section 5.1.2) to see whether it is hot enough to be a flame.

The LM34 has an accuracy of ~1 °F, and ambient room temperature can vary, so an increase in temperature of ~3-4 °F over a 20- or 30-second period appears necessary to indicate that a heat source (flame) is present. As seen in Figure 10, this occurred at ~2-2.5 cm from the flame. This is a very short range, but it is also a very small flame. Since the dogs are searching for a larger flame, a more substantial heat source that is safe indoors needs to be found for testing. The leading candidate is a space heater.

[pic]

Figure 10 - Temperature changes from a small butane torch as detected by the LM34 sensor at different distances (exposed to flame for 20 seconds in two trials and 30 seconds in two trials)
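A sketch of the detection rule these numbers suggest (the threshold and window come from the discussion above; the names are illustrative):

#define RISE_THRESHOLD_F 3.0 /* degrees F; comfortably above the LM34's ~1 F accuracy */

/* temp_start_f and temp_end_f bracket a 20-30 second sampling window. */
int heat_source_present(double temp_start_f, double temp_end_f)
{
    return (temp_end_f - temp_start_f) >= RISE_THRESHOLD_F;
}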

5.2 VISUAL LOCALIZATION

Work has been done to quantify the effect of errors in the measurements needed to localize a search dog on the resulting error in the location estimate. Specifically, the anchor dogs know how far apart they are from each other only to within some finite error. Similarly, each can determine the angle between the other anchor dog and the search dog only to within some finite error. Both error sources propagate into the calculated location.

To investigate this, a spreadsheet was designed that incorporates four parameters: the two angles θ and Φ and the two distances from the point of entry to each dog, a and b (where nominally a = b = d/2). It also incorporates the four possible errors: Δθ, ΔΦ, Δa, and Δb.

Figure 11 offers an example of this analysis. This example case illustrates the effect of error in angle on position. It assumes that a and b are known exactly as five while the error in θ and the error in Φ (set here to the negative of the error in θ) vary from the nominal θ = Φ = 45º. The correct location is (0, 5).

[pic]

Figure 11 - Error in position due to nominal 45 degree angle.

In this case, there can be almost six degrees of error in the angles before the localization is off by more than a foot in any direction, which provides a large enough margin of error.

However, this differs from the case where a and b are still known exactly as five and the errors in θ and Φ vary in the same manner, but the nominal θ = Φ = 80º. The correct location is now (0, 28.36). In this case, the error in location becomes intolerable with less than two degrees of error in the angles. The situation is far worse still when, unlike in Figure 11, the errors in θ and Φ are in the same direction: an angle error as small as 0.4 degrees then creates a position error greater than a foot.

[pic]

Figure 12 - Error in position due to nominal 80 degree angle.

Clearly, the nominal angles, which are determined by how far away the dog is, strongly affect the amount of error in the angle that is tolerable. Assuming that angle error can be minimized to about two degrees in each angle, which Dr. Lee suggests is a good estimate but needs to be experimentally investigated, distances of about ten feet from the origin can be reached before the error in position becomes more than one foot.

While error in the locations a and b of the anchor dogs also introduces error in location, the overall effect was still highly dependent on the angle and the error in the angle. This, in addition to the fact that the error in a and b can be minimized by verifying d via auditory localization, leads to the conclusion that the error in the angle is the larger of the two concerns but that, given a bounded region of ten feet, it can be overcome. Search pattern methods and integration of auditory localization will be needed to extend the search beyond these bounds, but exploring further will likely come at the cost of localization accuracy.
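The spreadsheet study can also be reproduced numerically. A sketch that perturbs both angles around a nominal geometry and reports the worst-case position error, reusing the localize() function sketched in Section 4.1.2.1:

#include <math.h>

void localize(double d, double theta, double phi, double *x, double *y);

/* Worst-case position error over the four +/-delta corner cases. */
double worst_error(double d, double nominal_deg, double delta_deg)
{
    const double rad = acos(-1.0) / 180.0;
    double x0, y0, x, y, e, worst = 0.0;
    int i, j;
    localize(d, nominal_deg * rad, nominal_deg * rad, &x0, &y0);
    for (i = -1; i <= 1; i += 2)
        for (j = -1; j <= 1; j += 2) {
            localize(d, (nominal_deg + i * delta_deg) * rad,
                        (nominal_deg + j * delta_deg) * rad, &x, &y);
            e = hypot(x - x0, y - y0);
            if (e > worst) worst = e;
        }
    return worst;
}

With d = 10, a nominal angle of 80º, and a delta of 0.4º, the same-sign corner cases reproduce the greater-than-one-foot error cited above.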

6 LESSONS LEARNED

7 EQUIPMENT / FABRICATION NEEDS

There are several parts required for this project that need to be purchased specially:

• Several different chemical and location sensor types will be investigated, meaning one of each type will be needed. Once a sensor has been picked, one sensor will be needed for each of the three AIBOs. Thus, approximately five sensors will need to be purchased.

• Three new HC11 Evaluation Boards (one for each of the dogs) will need to be purchased. These boards have special requirements. First, they need an oscillator chip capable of supporting an RS-232 baud rate of 19200 bps. Second, they need large amounts of EEPROM, since the HC11 sensor program must be stored permanently on HC11s running on battery power.

• Ribbon cables and break-out boxes will need to be obtained. The break-out boxes are no longer offered by Sony, but may be obtained from other sources. The particular type of ribbon cable will need to be determined and purchased.

8 CONCLUSIONS AND RECOMMENDATIONS

9 NOMENCLATURE

AIBO: Artificial Intelligence Robot

CCD: Charge-Coupled Device

CCSS: Coordinated Chemical Sensing System

EVB: Evaluation Board

GUI: Graphical User Interface

PC: Personal Computer

RGB: Red, Green, and Blue

10 REFERENCES

1. Olivier Rochel, Dominique Martinez, Etienne Hugues, Frédéric Sarry. “Stereo-olfaction with a sniffing neuromorphic robot using spiking neurons.” 2002. Accessed 19 October 2003.

2. Michael Wandel, Achim Lilienthal, Tom Duckett, Udo Weimar, Andreas Zell. “Gas Distribution in Unventilated Indoor Environments Inspected by a Mobile Robot.” 27 May 2003. Accessed 19 October 2003.

3. David Cohen, Yao Hua Ooi, Paul Vernaza, Daniel D. Lee. “The University of Pennsylvania Robocup 2003 Legged Soccer Team.” 2003. Accessed 19 October 2003.

4. Figaro Company. “TGS 2620 - for the detection of Solvent Vapors.” Accessed 22 October 2003.

5. Technological Arts. “Adapt11 Modular Prototyping System.” Accessed 23 October 2003.

6. Motorola Inc. “M68HC11E Family Technical Data.” 2000.

11 BIBLIOGRAPHY

1. Michael Wandel, Udo Weimar, Achim Lilienthal, Andreas Zell. “Leakage Localisation With A Mobile Robot Carrying Chemical Sensors.” September 2001. Accessed 19 October 2003.

2. Patraporn Techapaibul, Court A. Zabel. “Sensor Suit and Control for a Team of Autonomous Robots.” University of Pennsylvania School of Engineering and Applied Science. April 29, 2001.

12 FINANCIAL INFORMATION

Sensors constitute one category of expense. One sensor is required for each dog, and a few extras will be needed for experimentation to determine which types of sensors will ultimately prove most useful. The other area of expense is the three HC11 EVBs needed for the three AIBOs. Because of the RS-232 baud rate and EEPROM requirements for the boards, special EVBs will be purchased. The most likely candidate is the Adapt11C24DXSP60KT9 (oscillator speed of 9.8304 MHz, 32K of EEPROM, and 28K of external RAM). Our advisor Dr. Daniel Lee has offered to pay for these EVBs.

Table 1 - Budget

|Product Name |Model Number |Manufacturer |Unit Price |Units |
|Alcohol sensor |TGS 2620 |Figaro USA, Inc. |$14.50 |2 |
|CO2 sensor |TGS 4160 |Figaro USA, Inc. |$45.00 |3 |
|HC11 EVB |Adapt11C24DXSP60KT9 |Technological Arts |$138.00 |3 |
|Total | | |$578.00 | |
|Less outside funding for EVBs | | |-$138.00 |3 |
|Remaining total | | |$164.00 | |

13 ETHICS

14 SOFTWARE DOCUMENTATION

See Appendix 1 for Software Documentation.

APPENDICES

A1 Code for Reading Sensor Data into HC11

A2 Code for FlameObserver (visual identification of a flame)

A1: Code for Reading Sensor Data into HC11

/* sensorcap.c, revision 1.0

* Author: Paul Vernaza

* Date Created: November 12, 2003

* Purpose: Drive the conversion of analog sensor data to digital, protocol-

* based format.

* Location/Date of last backup: Raymond Ciarcia’s PC, December 10, 2003

*

* Revision history:

* Revision 1 -- first revision, November 12, 2003

* General operation: on each TOI, perform digital conversion and protocol

* conversion.

***************************************************************/

#include <stdio.h> /* printf */

#include <hc11.h> /* HC11 register definitions; header name assumed for the original toolchain */

#pragma interrupt_handler TOIISR

int time_to_take_reading = 0;

void TOIISR();

void main(){

int sensor_value = 0;

OPTION &= 0xBF; //clear bit6

OPTION |= 0x80; //power up the A/D system by setting bit7

//delay ~100 us for the capacitors in the A/D system to charge

TFLG1 = 0x40;

TOC2 = TCNT + 200; //wait 200 E-clock cycles (100 us at a 2 MHz E-clock)

while (!(TFLG1 & 0x40));

//pseudo vector for Timer Overflow Interrupt

*(unsigned char*)0xD0 = 0x7E;

*(void(**)())0xD1 = TOIISR;

TMSK2 |= 0x80; //locally enable TOI

asm("CLI");

while(1){

ADCTL =0x01; //set up single channel nonscan mode for AN1(PE1)

while(!time_to_take_reading);

while (!(ADCTL & 0x80)); //poll the CCF flag

time_to_take_reading = 0;

sensor_value = ADR1;

printf("SENSOR %d\n", sensor_value);

}

}

void TOIISR () {

TFLG2 &= 0x80; //clearing TOF to enable further interrupts

time_to_take_reading = 1;

}

A2: Code for FlameObserver (visual identification of a flame)

% FlameObserver Revision #1

% Author: Brian Corwin

% Date Created: 12/3/03

% Purpose: Visually identify the presence of a flame

% Location/Date of last backup: ENIAC, December 11, 2003

% Revision history:

% Revision 1 -- This Matlab program will use a threshold of 240 out of 255

% for r,g, and b for all pixels. If a pixel meets this standard

% then it is "bright" and is displayed as white. If not then it is

% displayed as black. The purpose is to see if in code the

% "bright" pixels of the flame can be extracted from the

% background.

function FlameObserver(nFrame)

% Code for allowing infinite frames

% ie display indefinitely

if nargin < 1,

nFrame = inf;

end

% Width and Height Of Image

yuvWidth = 176;

yuvHeight = 144;

% Set up hyuv image

hyuv = image(zeros(yuvHeight,yuvWidth,3));

htitle = title(' ');

set(hyuv,'EraseMode','none');

iImage = 1;

% Cycle through infinitely getting and displaying frames

while iImage <= nFrame
% (Remainder of the listing was truncated in the source copy. Per
% Section 5.1.2, each iteration grabs a camera frame, displays pixels
% with R, G, and B all above the threshold as white and all others as
% black, and updates the image.)
end
