Robot Design Proposal



Autonomous Mobile Robot Design

The Enterprise

“Going where no robot has gone before.”

Michelle Hyers

Audrey Tam

Rachael Tlumak

Brendan Woods

February 1, 2010

ME 412 – Autonomous Mobile Robots

Professor Mar

The Cooper Union for the Advancement of Science and Art

Albert Nerken School of Engineering

Table of Contents

Introduction

Design Requirements

Design Objectives

Mechanical Design

Chassis

Drive Train

Firing Mechanism

Mounts

Electrical Design

Sensors

Programming

Hardware

Conclusion

References

Appendices

Appendix I : AutoCad Drawing

Appendix II : Robot Code

Introduction

Design Requirements

The goal of the project is to design and fabricate an autonomous robot. This robot, whose maximum dimensions are 1 foot long by 1 foot wide by 10 inches high in all modes of operation, is meant to compete with other robots in a game where the objective is to score as many points as possible by shooting reloadable ping pong balls at the opponent’s home base and/or robot. The arena (Figure 1) in which this takes place will be a 6-foot by 10-foot rectangle, enclosed by 1-foot-high walls. Several ‘obstacles’ (white objects approximately 1 foot high each), to be avoided by the robot, will be randomly placed in the arena.

[pic]

Figure 1: Battle Arena Schematic

Design Objectives

In order to accomplish the above requirements, the design of the robot entails consideration of a broad range of capabilities. Such a robot must be devised with the following qualifications:

- Mobility

- Rapidity (the more balls fired, the greater chance to score more points)

- Shooting a ball / reload ability

- Sensing: avoiding obstacles, detecting its own and the opponent’s home bases, and sensing the opponent robot’s proximity.

- Self-protection: the objective is to be shot as few times as possible.

- Decision-making: the robot must be able to react appropriately to a given situation, be it an obstacle, the opponent robot, or a home base.

Therefore, the robot’s design will be broken up into three main categories, each of which contributes to a successful game:

- Mechanical design: chassis, motors, drive train, wheels, and everything regarding the physical parts of the robot and how they can be optimally positioned.

- Sensing: appropriate sensors will be chosen and used according to the requirements.

- Programming: the robot will be controlled autonomously by the Handy Board, a microprocessor-based robot controller designed for personal and educational robotics projects. The programming language used to give the robot decision-making skills is Interactive C.

Mechanical Design

A discussion of the individual components of the mechanical design follows. An AutoCAD drawing of the overall robot design is available in Appendix I of this report.

Chassis

The chassis is based on a two tier design (Figure 2) chosen to maximize the available space for the mechanical parts (drive train, gun mechanism), electrical parts (handiboard, batteries), and sensors (light sensors, short range IR sensors). The bottom layer is an 8-inch diameter circle cut off at two parallel chords to form two flat edges on opposite sides from each other. These edges provide space for the wheels, allowing them to be mounted as closely as possible to the motors. The second layer is an unmodified 8-inch diameter circle.

[pic]

Figure 2: Two-layered chassis design.

Both layers are made from round discs of CNC-machined expanded rigid PVC (Figure 3). This material was chosen because it is lightweight, strong, and easy to machine, and it produces very little static electricity, reducing the possibility of damaging electrical components with a static charge. The layers came with twelve 1/8-inch diameter predrilled holes, equally spaced about the circumference of the circle and ½ inch from the circle’s edge. A 3/8-inch diameter hole is located at the center.

[pic]

Figure 3: Round chassis layer material with predrilled holes, as shipped by the manufacturer []

The two chassis layers are connected by three 5-inch long spacers. The length of the spacers is chosen to allow clearance between the two layers for the parts mounted on the top of the bottom layer. The spacers are made out of long bolts and 5-inch long pieces of rigid metal tube of a diameter slightly larger than the diameter of the bolts. The three bolts are threaded through the bottom layer using the pre-drilled holes and are secured with nuts. The metal tubes are then put on the bolts and the upper layer of the chassis is placed on top, again secured by nuts. This spacer design allows a high degree of adaptability, as spacer height can easily be changed by changing the height of the metal tubes placed between the two chassis layers.

The circular shape of the chassis is chosen to help meet the design criteria of obstacle avoidance. Since a circular shape has no sharp corners, the robot will not get caught on any obstacles. Also, its bulk is somewhat evenly distributed about the robot’s pivot point, assuring that when it turns, no part of the robot will be accidentally swung into an obstacle or wall.

When designing the chassis, consideration of the placement of all the robot components had to be made. In the final placement of all the components, the drive train (motors and wheels) is mounted on the bottom of the lower layer with a corresponding battery pack, the Handy Board and short range IR sensors are mounted on the top of the lower layer, the light sensors are mounted on the bottom of the upper layer, and the gun, electronics battery pack, and heat signature are mounted on the top of the upper layer (Figure 4).

[pic]

Figure 4: Electronic and mechanical component placement on chassis.

Drive Train

The drive train consists of two servo motors, two double o-wheels, two hubs, a ball caster, and a pack of four AA batteries. The servo motors are GWS S03N standard servos, modified for continuous rotation. Each servo is 1.55 inches by 1.40 inches by 0.79 inches, weighs 1.48 oz, runs on 4.8-6 volts, has a speed of 0.23-0.18 sec/60º, and has a torque of 47-56 oz-in. The servo motors are attached to the wheels by custom-made hubs and mounts. The mounts are flange-style servo brackets, shown in Figure 5.

Figure 5: Servo mounts []

The wheels are 2½ inches in diameter and are lined around the perimeter with two rubber o-rings that provide traction. The wheels are attached to the hubs by three bolts for a sturdy connection. The wheels, hubs, and servo motors were all purchased as compatible parts.

A ball caster is mounted to the bottom of the lower chassis layer at the front of the robot. The two wheels are mounted in the rear third of the robot. This placement provides a stable tripod foundation to support the robot.

Skid steering is used for turning. This means that, in order to turn, the wheel on the side of the desired turn direction is stopped and the opposite wheel is driven forward. For example, if a right turn is desired, the right wheel is kept stationary and the left wheel is driven forward.

Firing Mechanism

The original plan for the gun was to use the shooting mechanism from a Nerf gun mounted on the top layer of the chassis, with a servo pulling the trigger. However, once the mechanism was constructed, the servo could not produce the torque necessary to reliably pull the gun trigger. The team decided that reliable shooting was a priority, and the modified Nerf gun was scrapped.

The new design goal was to create a firing mechanism that used the pre-existing mounts and parts left over from the failed Nerf gun idea. These mounts and parts included a clear plastic tube attached to the top chassis layer that held the ping-pong ball in place, and a servo mounted vertically with an attached lever arm that swung in the horizontal plane. This design goal was met with a weighted hammer mechanism.

In the final design, one side of a metal hinge is attached to the top of the plastic tube. The other side of the hinge is allowed to swing freely so that, if allowed to fall, it will hit the ball and reach equilibrium at a vertical position with respect to the chassis. The ball is positioned within the plastic tube, held in place by small plastic strips that act as stoppers, in a placement such that the hammer hits the ball when released, propelling it forward and out of the plastic tube. To increase the force applied to the ball by the hammer, and therefore increase the distance the gun can shoot the ball, a metal bolt with multiple nuts is attached to the end of the hammer, weighting it. With the final implementation of the hammer-gun design, the gun is weighted with one bolt and three nuts, which provides enough force to propel the ball forward about six inches past the front end of the chassis.

The hammer is held in the pre-fire position by the servo-arm. The servo is mounted by L-shaped brackets of metal on the side of the gun, and a strip of metal is attached to the servo arm so that it extends in the horizontal plane. This strip of metal is positioned under the hammer to hold it up. To fire, the servo-arm is turned, rotating the strip of metal out from beneath the hammer, allowing it to swing down into the ball. A diagram of the gun, minus the servo, is shown in figure 6.

[pic]

Figure 6: Diagram of gun that uses a hammer mechanism to propel the ball.

Mounts

The Handy Board is mounted to the top of the lower chassis layer by two metal brackets, placed diagonally from each other on opposite sides of the board. A bolt through one side of each bracket attaches it to the chassis, and a bolt through the other side passes through the predrilled hole on the side of the Handy Board, securing it.

Three short range IR sensors are mounted to the top of the lower chassis layer. The mounts consist of rectangular pieces of Lexan to which the sensors are attached. These rectangular pieces are adhered to the chassis via double-stick tape, to facilitate easy adjustment of sensor position. The light sensors, which are attached to the bottom of the upper chassis layer, are also mounted via Lexan rectangles adhered to the chassis. The sensors are inserted into holes through the centers of these Lexan rectangles to keep them in place.

Electrical Design

Sensors

Our initial design requirements involved the following issues: the IR range-finding sensors needed to be low enough to detect all obstacles, while being high enough to see over any ‘debris’ present on the field, such as stray ping pong balls; and the light sensors had to be placed at approximately the height of the ‘base’ LEDs to optimize their ability to track the goal. To incorporate these requirements into the robot’s sensor array, the sensors are split between the two tiers, each tier concentrating on a different detection aspect: the lower tier is outfitted with IR sensors and the upper tier with light sensors.

Initially, the lower tier was to carry an array of six IR sensors spread across the front half of the robot: two facing directly forward, two facing directly out from the sides, and two at a slight angle from the robot’s direction of forward motion. However, an interference phenomenon we observed when the IR sensors were placed too closely together forced us to forgo this extensive array. The final design therefore uses only three IR sensors, pointed forward, slightly left, and slightly right, as shown in Figure 8. This still allows the robot to observe its surroundings accurately while minimizing the interference the sensors create in each other. Another change from the original design was the decision to use only short-range IR sensors for range-finding. This was mostly a practical matter: it became increasingly difficult to reconcile the distances reported by the two sensor types, which eventually led to erratic behavior in the robot. One last element we had to compensate for was the minimum accurate distance of the sensors: once within a certain range, diffraction of the emitted IR light would cause an object to register as much farther away than it actually was. Since the robot would interpret such data to mean that the direction of the obstacle was the most clear, code was needed to filter the data these sensors send to the Handy Board.

[pic]

Figure 7: Picture of IR Sensor

Figure 8: IR Sensor Layout

For the upper tier, the initial design called for five or six light sensors in an array similar to the original IR sensor array, or for attaching some to the swinging arm that would have housed the ‘heat-sensing’ phototransistors. The final design, however, used only three light sensors in a layout mirroring the IR sensors on the lower tier. The reason for this change was an issue of accuracy: the sensors were so sensitive that the position of a light source between two adjacent sensors became almost impossible to determine. Reducing the number of light sensors let the robot determine the direction of the goal more accurately, and fixing the sensors to the chassis vastly simplified that determination.

One last concern with the light sensors was their housing. We had used milled clear plastic to house the sensors and anchor them to the robot, but it was quickly discovered that the plastic admitted too much ambient light through the sides of the housing, compounding the problem that arose with the close configuration originally chosen. This was solved by wrapping the clear plastic (except the front of each piece) in black electrical tape, keeping ambient light from interfering with the sensors.

[pic]

Figure 9: Light Sensor Schematic (Dimensions in inches)

Figure 10: Light Sensor Layout

Originally, a set of ‘heat-sensing’ phototransistor diodes was to be located above the firing mechanism and swept through a specific arc in front of the robot to detect enemy robots. However, it was determined that, in addition to other concerns about robot-hunting behavior, this was an unrealistic way to detect quickly moving robots: such a system could not zero in on a target with the speed necessary to hit a moving one. This, along with some other mechanical concerns, led to the omission of the ‘heat-seeking’ phototransistors.

[pic]

Figure 11: ‘Heat Sensor’ Phototransistor

[pic]

Figure 12: Devantec Compass

The last sensor component in the original design, the compass, was not included in the final iteration of the robot, for good reason. The compass had a known accuracy problem, but it was thought this could be compensated for by comparing all readings to an initial reading taken by the robot, yielding a relative direction. However, even after the compass was connected correctly and code was in place for its use, it produced no consistent data. Essentially, the compass could not be relied upon: it picked up variations in the magnetic field at different points around the arena and therefore fed erroneous data to the Handy Board. As the compass was only ever intended to ensure that the robot did not fire at its own goal, a programming work-around was created to compensate for its absence.

Programming

The programming was done in Interactive C on the Handy Board (Figure 13), fitted with the expansion pack (Figure 14). The initial programming goals were as follows: obstacle-avoidance routines based on the IR range-finder inputs, pinpointing the other team’s LED target using the photosensors, locating the enemy robot using a sweeping long-range IR sensor, and routines coordinating the sensors, motors, and launcher to allow movement towards the bases and subsequent firing at them.

[pic]

Figure 13: Handyboard

[pic]

Figure 14: Handyboard expansion pack.

The initial strategy the code was to be based upon was as follows: at the start, the robot would record the direction of the home goal relative to its compass so that it would never attempt to fire at its own goal. The fast-moving robot would constantly look for the LEDs of the enemy base and the IR heat signature of the enemy robot while avoiding obstacles in its path. If the enemy robot came into close proximity, the robot would shoot at it; otherwise, the robot would proceed to the location of the enemy goal and fire upon it. Once firing was completed, the robot would make a 180-degree turn according to the compass and navigate back to the home base using the IR and light sensors.

In the final design the strategy was altered due to a few discoveries made during development. The first issue was the choice of movement speed. During testing of the sensor programming, a slower speed produced more accurate responses, so slower, more controlled movement was adopted by using servos instead of gear motors. However, since other teams continued to pursue quick-paced robots, shooting at the enemy robot became an impractical goal. The second issue was the compass inaccuracy found during testing: the compass could not be trusted to navigate the arena or to prevent shooting at one’s own goal. The strategy was altered to rely solely on the sensors for navigation, which were programmed to react to specific conditions. Since the program was not aware of absolute directions, a timer was added to prevent activation of the firing mechanism at the home base: it started when the robot was powered on and blocked firing for twenty seconds, in case the robot turned around quickly. After firing at the enemy target, the robot would return to the home base the same way it had sought the enemy base. Upon return to the home base, firing did not have to be prevented, since the gun at that point would be empty.

The key aspects of the final programming design are obstacle avoidance, light detection of the bases, and firing that were integrated in a decision based logic structure. The final program was built up from independently tested programs for the IR sensors, light sensors and firing mechanism. These programs were optimized separately to account for the competition conditions, and then integrated in a decision structure. The decision structure utilized the data collected from each separate program, as well as the most effective and accurate movement strategies that were determined from the testing of the individual programs.

The IR sensor program acted as an obstacle-avoidance program based on the data received from the array of three short-range IR sensors, providing a one-dimensional view of the environment and a programmed reaction to it. The default condition, when all sensors reported only very distant obstacles, was to bear right. Right was selected over the more obvious forward direction because the rules guaranteed there would be no straight path across the field, so a forward default would only increase the chance of encountering obstacles. If fairly close obstacles were detected, the robot would move in the direction of the IR sensor reporting the largest distance, which indicated the least obstructed path. If all three sensors reported extremely close obstacles, the robot would turn about its center until this was no longer the case.

The light sensor program was the light-seeker routine for the robot, in which three light sensors reported the light level in specific directions: left, middle, and right. The robot would move in the direction of the brightest value, which is the lowest analog signal received. Several conditions were tested, with the ambient light at high, low, and extremely low levels. At moderate ambient light, when the light source was not very close, moving towards the brightest reading resulted in erratic movements due to noise from ambient light. To avoid this, a buffer value was introduced so that no change in response would occur unless the difference between the darkest and brightest values exceeded the buffer. This buffer value was experimentally optimized so that ambient noise was eliminated and the robot responded only to a true light source, specifically the LEDs of the bases.

The servo movements were controlled using the built-in servo functions. Setting a servo to a value between 0 and 2540 turns it counter-clockwise, while a value between 2540 and 3000 turns it clockwise; 2540 is the center point, which holds the servo at a fixed position. The firing mechanism consists of a single servo motor that is driven in a specific direction when the firing conditions are met. Because of the limited range of the firing mechanism, the firing condition requires at most approximately 5 inches between the light source and the front of the robot.

The integration of these programs was based upon consideration of the different situational cases and a programmed response based upon decision-based structured code.

[pic]

Figure 15: Pictorial representation of the programmed responses for all six possible cases.

These cases are listed in Table 1 and shown pictorially in Figure 15.

|Case |Description of Case Detected |Programmed Response |
|I |No light; far obstacles |Move towards right |
|II |No light; close obstacles |Move towards least obstructed path |
|III |No light; extremely close obstacles |Move to avoid obstacles |
|IV |Light; far obstacles |Move towards light |
|V |Light; close obstacles |Turn about center towards light source |
|VI |Light; extremely close obstacles |Fire |

Table 1: Outline of the situational cases and the corresponding responses.

The programs for Cases I-III function directly from the IR sensor (obstacle avoidance) program. Cases IV-VI indicate scenarios in which the light sensor program detects a true light source, as determined by the calibrated buffer value; these last three cases require integrated programming between the light, IR, and firing programs. For Case IV, since there are only far obstacles, the robot can simply move towards the light. For Case V, however, the IR program would want to move towards the least obstructed path while the light program would move towards the brightest light. This conflict is resolved by recognizing that the light source, the LED at the base, is fixed and is itself seen by the IR program as an obstacle. Therefore, in Case V, the integrated response is to turn towards the light source and disregard the IR sensor readings. This will not result in a collision, since apart from the base light there is no scenario in which an obstacle emits light. The firing case, Case VI, occurs when the forward-facing light sensor records an extremely bright value, indicating only a few inches between the robot and the light source, and the front-facing IR sensor also records a very close obstacle. Under these conditions it is the optimal time to fire, since the light source is close and directly ahead.

The reactive programming approach was successful in the competition, since it eliminated the need for position-recording devices, from optical mice to encoders, that add to the amount of data processing and programming required. This program design would require only slight modifications for different sensory conditions, such as high ambient light, but would be able to navigate many types of environments thanks to its reactive style.

Hardware

➢ POWER

o Independent Servo Power ( what cuts/alterations were made to Handyboard)

o General power concerning Handyboard – aka power management

➢ WIRING

o Wiring of servos ….

o Wiring of Sensors…. (IR, light) (Be sure to mention the cuts/alterations to Handyboard – to cut the pull-ups )

▪ IR on expansion ports

▪ Light on analog –handyboard ports

[pic]

Figure 16: Wiring of IR sensors

➢ TROUBLESHOOTING

o Problems with adapter burning out

o Low power problems

Conclusion

- Lessons learned

o Problems with adapter burning out

o Interference between IR sensors

o More sensors isn’t necessarily better

o Erratic compass reading – decided not to rely on it

- Results

o Good performance

o 1st place

References

Appendices

Appendix I : AutoCad Drawing

Appendix II : Robot Code

#use "alc_cmps03.ic"

/* Define Global Variables */

int IR_Left;

int IR_Mid;

int IR_Right;

int Light_Left;

int Light_Mid;

int Light_Right;

int Heat_Left;

int Heat_Mid;

int Heat_Right;

int Heat_Flag;

int Light_Flag=0;

int Heat_Direction;

int Light_Direction;

int IR_Direction;

int IR_High;

int IR_Low;

int IR_moveaway;

int Light_Low;

int Light_High;

int abs_dir; /* Part of proposed compass subroutine */

float time_past=0.00;

float time_difference;

float current_time;

/* Main function that loops through sensor and decision subroutines*/

int main () {

servo3=1000;

reset_system_time();

abs_dir=compass();

while(1) {

/*check_Heat(); */ /* Code not used for the final design */

check_Light();

check_IR();

decide();

}

}

void check_Light() {

int light2=0;

int light3=0;

int light4=0;

int brightest=255;

int LightBuffer=40; /* Experimentally optimized value */

Light_Flag=0;

Light_Left=analog(2);

Light_Mid=analog(4);

Light_Right=analog(6);

light2 = analog(2);

/* These statements determine the brightest light sensor reading */

if (light2 < brightest) {

brightest=light2;

Light_Direction=1; }

light3 = analog(4);

if (light3 < brightest) {

brightest=light3;

Light_Direction=2; }

light4 = analog(6);

if (light4 < brightest) {

brightest=light4;

Light_Direction=3; }

printf("%d, %d, %d \n", light2, light3, light4);

/* Unused subroutine for determining brightest & darkest values*/

//If Left is brightest

/* if ((Light_Left < Light_Mid) && (Light_Left < Light_Right)) {

Light_Low=Light_Left;

Light_Direction=1;

if (Light_Mid < Light_Right)

Light_High=Light_Right;

else

Light_High=Light_Mid; }

//If Middle is brightest

if ((Light_Mid < Light_Left) && (Light_Mid < Light_Right)) {

Light_Low=Light_Mid;

Light_Direction=2;

if (Light_Mid < Light_Right)

Light_High=Light_Right;

else

Light_High=Light_Right; }

//If Right is brightest

if ((Light_Right < Light_Mid) && (Light_Right < Light_Left)) {

Light_Low=Light_Right;

Light_Direction=3;

if (Light_Mid < Light_Left)

Light_High=Light_Left;

else

Light_High=Light_Mid; } */

/* Check if large differences in high and low readings – which would indicate a true light source as opposed to noise */

if (brightest < LightBuffer ) {

beep();

Light_Flag=1;}

}

/*check_Heat() { //Unused subroutine

int Heat_High;

int Heat_Low;

int HeatBuffer=90;

Heat_Left=analog(6);

Heat_Mid=analog(7);

Heat_Right=analog(8);

//If Left is brightest

if ((Heat_Left > Heat_Mid) && (Heat_Left > Heat_Right)) {

Heat_High=Heat_Left;

Heat_Direction=1;

if (Heat_Mid > Heat_Right)

Heat_Low=Heat_Right;

else

Heat_Low=Heat_Mid; }

//If Middle is brightest

if ((Heat_Mid > Heat_Left) && (Heat_Mid > Heat_Right)) {

Heat_High=Heat_Mid;

Heat_Direction=2;

if (Heat_Left > Heat_Right)

Heat_Low=Heat_Right;

else

Heat_Low=Heat_Right; }

//If Right is brightest

if ((Heat_Right > Heat_Mid) && (Heat_Right > Heat_Left)) {

Heat_Direction=3;

Heat_High=Heat_Right;

if (Heat_Mid > Heat_Left)

Heat_Low=Heat_Left;

else

Heat_Low=Heat_Mid; }

//Check if large differences in high and low readings - see light source

if ((Heat_High - Heat_Low) > HeatBuffer )

Heat_Flag=1;

} */

void check_IR() {

IR_Left=analog(16);

IR_Mid=analog(18);

IR_Right=analog(20);

// Intended to filter out erroneous distance data caused by IR

// diffraction at very close range; as written, these assignments

// leave the readings unchanged.

if (IR_Left < 75 )

IR_Left=IR_Left;

if (IR_Mid < 75 )

IR_Mid=IR_Mid;

if (IR_Right < 75 )

IR_Right=IR_Right;

// This code determines the closest and farthest IR sensor readings

//If Left reads farthest (least obstructed)

if ((IR_Left < IR_Mid) && (IR_Left < IR_Right)) {

IR_Low=IR_Left;

IR_Direction=1;

if (IR_Mid < IR_Right) {

IR_High=IR_Right;

IR_moveaway=3;}

else {

IR_High=IR_Mid;

IR_moveaway=2; } }

//If Middle reads farthest (least obstructed)

if ((IR_Mid < IR_Left) && (IR_Mid < IR_Right)) {

IR_Low=IR_Mid;

IR_Direction=2;

if (IR_Left < IR_Right) {

IR_moveaway=3;

IR_High=IR_Right; }

else {

IR_moveaway=1;

IR_High=IR_Left; }}

//If Right reads farthest (least obstructed)

if ((IR_Right < IR_Mid) && (IR_Right < IR_Left)) {

IR_Direction=3;

IR_Low=IR_Right;

if (IR_Mid < IR_Left) {

IR_moveaway=1;

IR_High=IR_Left; }

else {

IR_moveaway=2;

IR_High=IR_Mid; }}

}

void decide() {

int dir=180;

int t=100000;

if (Light_Flag==1) {

if ( Light_Mid < 16 ){

while(t>100) {

beep();

t=t-1;}

shoot(); }

/* Compass and its proposed subroutine unused due to erratic data received from it during testing */

/* if ((abs_dir-compass())>150){

while (dir>15){

servo1=2750;

servo2=2750;

init_expbd_servos(1);

dir=abs_dir-compass();

if (dir > 50) {

servo1=3000;

servo2=2750;

init_expbd_servos(1);

sleep(1.0); } } } */

if (IR_High > 50) {

move_away(IR_moveaway); }

else {

move_towards(IR_Direction); } } }

void move_away(int direction) {

if (direction==1) {

servo1=3000;

servo2=2750;

init_expbd_servos(1);

//printf("Motor moves away from left");

}

if (direction==2) {

servo1=1000;

servo2=3000;

init_expbd_servos(1);

//printf("Motor moves away from forward");

}

if (direction==3) {

servo1=1000;

servo2=0;

init_expbd_servos(1);

//printf("Motor moves away from right");

}

sleep(0.1);

}

void move_towards(int direction) {

if (direction==1) {

servo1=1000;

servo2=0;

init_expbd_servos(1);

//printf("Motor towards left");

}

if (direction==2) {

servo1=3000;

servo2=0;

init_expbd_servos(1);

sleep(0.1);

//printf("Motor towards forward");

}

if (direction==3) {

servo1=3000;

servo2=2750;

init_expbd_servos(1);

//printf("Motor towards right");

}

sleep(0.1);

}

void move_towards_light(int direction) {

if (direction==1) {

servo1=1000;

servo2=0;

init_expbd_servos(1);

//printf("Motor towards left");

}

if (direction==2) {

servo1=3000;

servo2=0;

init_expbd_servos(1);

//printf("Motor towards forward");

}

if (direction==3) {

servo1=3000;

servo2=2850;

init_expbd_servos(1);

//printf("Motor towards right");

}

sleep(0.005);

}

void shoot (){

current_time=seconds();

time_difference= current_time - time_past;

if (time_difference > 20.0000 ) {

beep();

servo1=2555;

servo2=2555;

servo3=1000;

init_expbd_servos(1);

sleep(1.0);

beep();

servo3=2000;

init_expbd_servos(1);

sleep (1.0);

servo3=1000;

init_expbd_servos(1);

sleep (5.0);

}

servo1=3000;

servo2=2850;

init_expbd_servos(1);

sleep(1.4);

/* while(dir ... */

}
