"THE BOREDOM DESTROYER"
A Desk Snack Food Launcher for Bored Students

Jessica Mylynne Shaw
4/22/2014
University of Florida
EEL 5666 - IMDL - Final Report
Instructors: A. Antonio Arroyo, Eric M. Schwartz
TAs: Josh Weaver, Andy Gray, Nick Cox, Daniel Frank

Table of Contents

Abstract
Executive Summary
Introduction
Integrated System
Mobile Platform
Actuation
Sensors
Behaviors
Conclusion
Documentation
Appendices
  IR Sensor Configuration Code (Arduino Language)
  Motor Configuration Code (Arduino Language)
  Servo Configuration Code (Arduino Language)
  LED Configuration Code (Arduino Language)
  Obstacle Avoidance plus Edge Avoidance Code (Arduino Language)
  Both Avoidances plus Face Detection Serial Code (Arduino Language)
  Face Detection (Using webcam) (Python Language)
  Face Detection plus Serial Communication (Using webcam) (Python Language)
  Wiring Diagram

Abstract

For students, it is often difficult to stay focused for long periods, so they need breaks between long study sessions. The Boredom Destroyer interacts with students as a snack food launcher to give them a fun break from school work. It is turned on with a switch and roams around searching for a face while performing obstacle and edge avoidance; once a face is found, it signals the user to move up or back, and when the student is in the ready position the robot shoots.
When turned on, it moves around performing obstacle avoidance with three infrared (IR) sensors on the bottom platform pointing outward, and table edge avoidance with two IR sensors pointing downward. The robot performs face detection using an IP wireless camera and OpenCV, judging distance from the size of the user's face, and then signals the user to move closer or back up with three different LEDs until the user is in the right position. This report goes into the different parts of the robot and the struggles that had to be overcome to make it functional.

Executive Summary

The Boredom Destroyer is a snack-food-launching robot that uses a combination of sensors to perform its tasks. The project began with a Solidworks model to determine the layout of the platform, sensors, and firing mechanism. After some development, a two-level platform was found to be best: the bottom level holds the camera, hardware, and the majority of the wiring, while the top level provides plenty of room for a firing mechanism of any kind.

From the Solidworks diagram, the platform was built using a T-Tech machine to cut the parts out of wood. The hardware was then ordered and obtained, adding more weight to the platform; this reduced the concern that the robot would shift when the firing mechanism is activated. As the parts came together, the robot took shape, ready for movement.

To move the robot, 75:1 Metal Gearmotor DC motors are used, paired with brackets that fix the motors to the underside of the bottom platform, and wheels with shaft adapters so there is no need for hubs. A dual motor driver controls the motors' speed and direction for different situations.
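The dual driver interface described above (two digital direction pins plus one PWM pin per motor) can be modeled as a small truth table. This is an illustrative sketch only; which (HIGH, LOW) combination means "forward" depends on how the driver is wired, so the mapping below is an assumption.

```python
# Illustrative model of one channel of a two-pin motor driver: two digital
# inputs choose the direction, and an 8-bit PWM value sets the speed.
# The forward/reverse mapping is an assumed convention, not from the report.

def motor_state(in1, in2, pwm):
    """Return a human-readable state for one driver channel."""
    speed = pwm / 255.0            # analogWrite duty cycle, value is 0..255
    if in1 == in2:
        return "stopped"           # both LOW (coast) or both HIGH (brake)
    direction = "forward" if (in1 and not in2) else "reverse"
    return "%s at %d%% duty" % (direction, round(speed * 100))
```

This is why the Arduino drive functions in the appendix write two `digitalWrite` calls plus one `analogWrite` per motor.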
For obstacle and edge avoidance, five IR sensors were used: three on top of the bottom platform pointed toward the front of the robot at different angles, and two on the underside of the bottom platform pointed downward. These sensors work alongside the robot's special sensor, an IP wireless camera that looks for faces using OpenCV. As the robot roams using obstacle and edge avoidance, the camera searches for a face. Once a face is detected, the robot is told to stop and is then sent a character over serial indicating what the camera is detecting. In this state the robot directs the user to move closer, move farther, or hold the correct position using three different LEDs. When the user is in the correct position, the servo sets off the trigger, causing a cheese ball to be fired from the cannon at the user's face.

When it came to the firing mechanism, a few different ideas were bounced around. The initial idea was a miniature catapult, since that would allow a variety of snack foods to be launched. In time, it was found that this would be difficult to automate mechanically and costly in both time and money. Through some testing, it was found that compressed air through a PVC pipe was enough to propel cheese balls with ease. This changed the firing mechanism design to an air cannon instead of a catapult. The next step was to automate the trigger. The simplest method found was a joint system connecting the trigger to a servo. As the servo rotates, the trigger of the compressed air can is pulled, sending a blast of air through tubing to the PVC pipe to propel a cheese ball out.
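The three-LED guidance described above can be sketched as a small decision function. This is an illustrative sketch, not the project's code; the pixel thresholds and function name are made-up calibration values.

```python
# Hypothetical sketch of the three-LED distance signal described above.
# The pixel thresholds are assumed calibration values, not from the report.

def led_for_face(box_width_px, near_px=220, far_px=140):
    """Map the detected face-box width (in pixels) to an LED color.

    A wide box means the user is too close (red: back up), a narrow box
    means too far (blue: come forward), and anything in between is the
    ready position (green: fire the cannon).
    """
    if box_width_px > near_px:
        return "red"    # back up
    if box_width_px < far_px:
        return "blue"   # come forward
    return "green"      # ready position: fire
```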
Overall, this project has provided a challenge, and the report below goes into more detail about each part, with figures, and discusses the difficulties that came with them.

Introduction

The idea of the Boredom Destroyer came about when I was brainstorming and remembered my brother and his friends throwing food at each other's faces when they got bored. I figured it would be more impressive to have a robot doing that for you as a study break. This gave me a problem to solve: "For students, it is always difficult to stay focused for long periods, so they need breaks between long study sessions. The Boredom Destroyer would interact with students as a snack food launcher and give them a fun break from school work."

With that, the objective of the project was to develop a snack food launcher that could detect a person's face and then position either itself or the face for the optimal shooting range. This paper will go through the system's hardware and software, discussing what issues arose and what was done to overcome them. Any specific code or diagrams not seen in the report are in the Appendix.

Integrated System

The system begins with the robot roaming around on a table in search of a face. When it detects a face, it lets the user know to come forward or back up with a blink of one of two LED colors (red or blue). Finally, when the student is in the ready position, the robot blinks a green LED and launches the food at the student's face using the servo and air can. As shown in figure 1, data flows from the Xbee serial link and the IR sensors to the Arduino DUE board, which then drives the motors that move the robot and the servo for the firing mechanism. All of this data flow occurs within the Arduino loop, ensuring that the robot continuously updates its data.
Figure 1: Data flow to and from the Arduino DUE board

For the firing mechanism, there were two ideas in the works. The first was a catapult type, where a motor would be moved forward on a track to engage the system, using a solenoid to help drive the food holder down to another solenoid that would lock the food holder in place until loaded. The motor solenoid would then drive the motor back along the track, away from the food holder, so as not to ruin the motor when the food holder is released. In this design, there would be buttons where the food holder hits when released (to let the robot know it had fired) and where it is pulled back and locked in (ready for reload). The other idea was an air cannon launcher, where a pressurized air container has its trigger pulled to fire the food with air. The air cannon idea was chosen after looking at cost and doing some simple tests that proved the simplicity of the design.

The microcontroller board used is the Arduino DUE, chosen to allow a maximum number of ports and pins for any functions that might be added later in the robot's design. The board was also easier to program than other options, leaving more time to focus on programming the camera and getting the firing mechanism to work. A wireless router with Xbees connects the board and camera to a computer, allowing for more processing speed and programming capabilities with regard to OpenCV. The power source is a two-cell 7.4 V LiPo battery. This battery provides enough power for the two motors, the camera (after adding a voltage converter near the source), the servo, and the board (which in turn provides 5 V for the IR sensors). The motors obtained are 75:1 Metal Gearmotor Brushed DC Motors, which provide enough torque for the platform that was made and fit within the power range I was looking for.
They also include encoders in case I ever want to monitor the position of the motors for a later project.

Mobile Platform

The platform was difficult in terms of using space efficiently while providing a rigid setup for the firing mechanism. Placing the wiring and microcontroller around the firing mechanism became difficult once the battery was included. The questions became: how do I put the battery on the platform so it is not in the way of the boards, motors, and caster wheel, and how do I orient it so it can include a switch?

Figure 2: One of the pins used in the platform connectors to allow access to the hardware and wiring components

After rearranging the boards, adding more holes, and rewiring around the battery, everything fit (the wiring diagram is included in the Appendix). Most of the wiring, hardware, and boards went on top of the bottom platform, while the motors and the two downward-facing IR sensors went on its underside with the caster wheel. Then the question was where to put the firing mechanism. The easiest solution was another platform layer. After modeling connectors, it was just a matter of how to get back to the bottom platform components if needed. Figure 2 shows one of the three pins that keep the platform together while remaining removable, giving access to the other components. The challenge here was rapidly prototyping replacements if these pins were lost. I experimented and found that looping in wire works just as well, and wire is much easier to deal with, especially in the tighter pin. The platform definitely stretched my material skills. I also had to experiment with different hardware and with different ways to organize wires, which was new to me as a mechanical engineer.
Actuation

The goal for the actuation of the platform was to get the motors running as soon as possible so that obstacle avoidance could be implemented. Depending on the configuration, anywhere from one to four motors can be used. I went with a basic three-wheel setup where the front two wheels are driven by motors and the back wheel is a caster, as seen in figure 3. This was a configuration I was used to from past projects, and it was cost effective in that I only had to get two motors.

Figure 3: Actuation setup

The embarrassing lesson I learned was that on top of the motors and wheels, I needed a motor controller to actually drive the motors with enough power and the proper logic, and I did not learn this until I really needed to get everything programmed and wired. Programming the dual motor controller I bought was a new experience: I had never hardwired such a thing before, let alone programmed it. After much research and trial and error, I got it working. The software was interesting in that two digital outputs are required to change a motor's direction, and making sure the wheels turned in the intended direction took the most time. My motor configuration code can be found in the Appendix. I also had to experiment with the speed. At first I had my PWM value set at 200; I did not realize that was a high value, only that the motors moved. After testing in front of my peers, it was clearly very fast, especially for edge detection, so I changed my PWM values to 128, a more moderate speed. Before the final demo it may get even smaller, but this value has been working well.

Sensors

For edge detection, there are two IR sensors in use on the bottom of the robot.
For obstacle avoidance, three IR sensors are in use on top of the bottom platform, pointed outward at different angles. Figure 4 shows the setup of the five IR sensors on the robot. IR sensors 1 to 3 (right to left) handle obstacle avoidance, and sensors 4 and 5 (left to right) on the bottom handle edge avoidance, in reference to the IR sensor configuration code in the Appendix.

Figure 4: IR Sensor Setup

Testing the obstacle avoidance was simple enough: put obstacles in the path of the robot at different angles and see whether the robot avoids them. The robot did its job at the faster PWM I had before, but it was quite jerky. Reducing the PWM values made the robot much smoother in avoiding obstacles. It also gave the robot more time to recognize edges so as not to fall off the table. Figure 5 shows the robot on a piece of table three inches off the ground, a perfect testing area for edge detection: if the robot does not detect the edge in time, it only falls a few inches instead of feet. Overall, the edge avoidance has been working well.

Figure 5: Robot on top of a table 3 inches off the ground to test edge avoidance

Figure 6: D-Link IP wireless camera setup underneath the cannon

The robot performs face detection using an IP wireless camera with OpenCV to determine how far the user's face is from the robot. The Python face detection program (in the Appendix) draws a box around the face and calculates a distance from that box. As the box gets smaller, the user is farther away; as it gets larger, the user is closer. To use the camera, Xbee wireless communicators are needed for both the computer and the Arduino DUE board. The real challenge in this part of the robot was OpenCV.
After weeks of trying Visual Studio, Processing, and Eclipse on Windows, all it took was uninstalling and reinstalling Visual Studio to get OpenCV to work. But every time I tried to get my face detection program to run in Visual Studio, it would fail. After many weeks of hitting my head against the wall, I switched from Windows to Ubuntu to use Python. I had no experience with Linux or Python at all, but since I was getting nowhere with Visual Studio (most likely because it was not set up properly and I did not know it well enough to dive in and fix its properties), I was in about the same situation either way. After getting Ubuntu successfully installed on my newly partitioned hard drive, I used Python to install OpenCV, and it worked! It was a funny experience: knowing almost nothing about Linux or Python, I got everything working far sooner than with Windows and C++, which I knew better. With serial communication working through pySerial, the Python face detection program sends characters through the Xbees to the Arduino DUE, which processes them to determine whether there is a face and where it is. Although OpenCV is finally working, the camera has unfortunately been stubborn about being called from the Python program, even though it has been successfully set up on a router. I will keep working on this issue until Media Day, but if it does not work, I will try my computer's webcam. If I can get the IP camera working, I will also get it configured and better positioned, as in figure 6 above.

Behaviors

One behavior of the robot is obstacle and edge avoidance. This was the first behavior to be accomplished. The most difficult part of this behavior was the configuration of the sensors.
Not that it was hard; it was just tedious, because the sensors have to be reconfigured every time the robot moves to a new location with different lighting. This code can be found in the "Obstacle Avoidance plus Edge Avoidance" section of the Appendix.

Another behavior is shooting snack food at people. This was the simplest code-wise, so it happened next. Configuring this behavior was fun in that many of my friends got the chance to be snack food catchers as I figured out how much air it takes to push a cheese ball out of the PVC pipe. For this configuration code, see the "Servo Configuration Code" section of the Appendix.

The last and hardest behavior is detecting faces and alerting the user to their distance from the robot. This was a mix of Arduino and Python code: the Arduino code combines all of the behaviors, and the Python code sends characters through the Xbees telling the DUE board which behavior to perform, in terms of which LED to light up and/or whether to fire the cannon. This code is a combination of the "Both Avoidances plus Face Detection Serial Code" and "Face Detection plus Serial Communication" sections in the Appendix.

Conclusion

Realistically, I think I accomplished a lot over the course of this semester. I achieved obstacle avoidance after a lot of hard work, and edge avoidance came easily after it. I eventually got OpenCV face detection to work after much frustration and switching between operating systems and programming languages. The cannon was a pleasant surprise after all of the software issues and after being told that the servo was believed to be too weak to pull the trigger of the air can. After a lot of soldering, all of my circuits are more permanent, and I was able to make an LED notification board. The LED notification board was an adventure in itself, really teaching me the value of working in daylight.
It was not working for a while, and I could not figure out why until I realized that I had never completed the circuit to ground. I laughed at myself for hours over that one.

OpenCV and its many issues were my main limitation, as was programming in general. As a mechanical engineer, I know I will need to do some programming, but I really only knew the basics and had not programmed anything at this scale in years. It was a struggle to program all that I have, but this project helped me brush up my programming skills and gave me the chance to stretch myself, in some ways maybe even too much at times. I think the cannon exceeded my expectations, especially since many of my more experienced peers told me there was no way the servo could pull down the air can trigger. The software is the area I would most want to improve. When I get the chance, I would like to go back, improve my face detection programs, and really understand them more, so the robot can have all of the capabilities I originally wanted.

With that, if I could start the project over and change anything, I would start programming much earlier and be more flexible about giving other routes a chance, even ones I knew little about. Any enhancements would be in the software, since that is where I am lacking. With more time, I am sure I could learn the skills needed to pull off the tasks I originally wanted.
Overall it was a challenge, and I hope to continue developing the software for years to come.

Documentation

TABLE I: Part list with link to website information

- Arduino DUE Microcontroller
- DCS-930L IP Wireless Camera
- GP2D120XJ00F IR Sensors
- 75:1 Metal Gearmotor Brushed DC Motors
- Bracket Pair with 4mm Shaft Adapters
- Motor Controller Module
- Explorer USB Shield Module for DUE
- 7.4V 5000mAH battery & charger kit
- Cell LiPo Battery Monitor
- Rotation Servo HSR-1425CR
- Wheel: found one to fit my robot at Lowe's
- PVC Pipe: found a pipe and end that worked with cheese balls at Lowe's
- Compressed Air Can: found one to fit my robot at Walmart
- Tutorials and Documentation for Ubuntu, Python, and OpenCV:
  - A. Mordvintsev and Abid K. (2014, March 20). OpenCV-Python Tutorials Documentation. [Online].
  - Liechti. (2013). PySerial Documentation. [Online].
  - (2013, March 10). Installing PySerial Under Linux: Serial Begin. [Online].

Appendices

IR Sensor Configuration Code (Arduino Language):

// Verify that the IR sensors are working and read their output values.

// int var1 = 0; // for filtering data if needed
// int var2 = 0;
// int var3 = 0;
// int var4 = 0;
// int var5 = 0;
// int var6 = 0;
int sensorValue1 = 0;
int sensorValue2 = 0;
int sensorValue3 = 0;
int sensorValue4 = 0;
int sensorValue5 = 0;

// the setup routine runs once when you press reset:
void setup() {
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
}

void loop() {
  // variables used below for filtering data if needed
  // int var1 = analogRead(A1);
  // delay(1); // delay in between reads for stability
  // int var2 = analogRead(A2);
  // delay(1);
  // int var3 = analogRead(A3);
  // delay(1);
  // int var4 = analogRead(A1);
  // delay(1);
  // int var5 = analogRead(A2);
  // delay(1);
  // int var6 = analogRead(A3);
  // delay(1);
  sensorValue1 = analogRead(A7);
  // move away when >500
  Serial.print(sensorValue1);
  Serial.print(',');
  delay(250); // delay in between reads for stability
  sensorValue2 = analogRead(A8); // move away when >550
  Serial.print(sensorValue2);
  Serial.print(',');
  delay(250);
  sensorValue3 = analogRead(A9); // move away when >530
  Serial.print(sensorValue3); // print, not println, so the readings stay on one line
  Serial.print(',');
  delay(250);
  sensorValue4 = analogRead(A10); // move away when less than 600
  Serial.print(sensorValue4);
  Serial.print(',');
  delay(250);
  sensorValue5 = analogRead(A11); // move away when less than 600
  Serial.println(sensorValue5); // println ends the line of readings
  delay(250);
}

Motor Configuration Code (Arduino Language):

// This code tests that the motors are calibrated correctly and that the drives match reality.

// ----- Motors
int motor_left[] = {28, 29};  // digital pins for left motor
int motor_right[] = {22, 23}; // digital pins for right motor
int motor_leftPWM = 2;        // PWM pin for left motor
int motor_rightPWM = 3;       // PWM pin for right motor
int ledPin = 13;              // LED connected to digital pin 13

// ----- Setup
void setup() {
  Serial.begin(9600); // serial communication at 9600 bits per sec
  // set motor pins as outputs
  for (int i = 0; i < 2; i++) {
    pinMode(motor_left[i], OUTPUT);
    pinMode(motor_right[i], OUTPUT);
  }
  pinMode(ledPin, OUTPUT);
  pinMode(motor_leftPWM, OUTPUT);
  pinMode(motor_rightPWM, OUTPUT);
}

// ----- Drive
void motor_stop() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 0);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 0);
  delay(25);
}

void turn_left() {
  digitalWrite(motor_left[0], HIGH);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], HIGH);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM,
               128);
}

void turn_right() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], HIGH);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], HIGH);
  analogWrite(motor_rightPWM, 128);
}

void drive_forward() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], HIGH);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], HIGH);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 128);
}

void drive_backward() {
  digitalWrite(motor_left[0], HIGH);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], HIGH);
  analogWrite(motor_rightPWM, 128);
}

// ----- Loop
void loop() {
  // test that the motors are calibrated correctly and that the drives match reality
  drive_forward();
  delay(1000);
  motor_stop();
  delay(1000);
  Serial.println("1");
  drive_backward();
  delay(1000);
  motor_stop();
  delay(1000);
  Serial.println("2");
  turn_left();
  delay(1000);
  motor_stop();
  delay(1000);
  Serial.println("3");
  turn_right();
  delay(1000);
  motor_stop();
  delay(1000);
  Serial.println("4");
  motor_stop();
  delay(1000);
  motor_stop();
  delay(1000);
  Serial.println("5");
  // blink the LED to show the loop is running even if the motors aren't moving
  digitalWrite(ledPin, HIGH); // set the LED on
  delay(3000);
  digitalWrite(ledPin, LOW);  // set the LED off
  delay(1000);
}

Servo Configuration Code (Arduino Language):

// Servo sweep to test servo movement with the trigger
#include <Servo.h>

Servo myservo; // create servo object to control a servo
               // a maximum of eight servo objects can be created
int pos;       // variable to store the servo position

void setup() {
  myservo.attach(9);  // attaches the servo on pin 9 to the servo object
  Serial.begin(9600); // initialize serial communication at 9600 bits per second
}

void loop() {
  char shoot = Serial.read();
  if (shoot == 'a') {
    pos = 0;
    myservo.write(pos); // tell servo to go to position in variable 'pos'
    delay(600);
    pos = 180;
    myservo.write(pos);
    delay(400);
    pos = 90;
    myservo.write(pos);
    delay(1000);
    Serial.print("shot");
  }
}

LED Configuration Code (Arduino Language):

/* Blink: tests to make sure the other LEDs are functioning */
// Pin 13 has an LED connected on most Arduino boards.
int led = 13;
int red = 33;   // 33 is red, 35 is green, 37 is blue
int green = 35;
int blue = 37;

void setup() {
  // initialize the digital pins as outputs.
  pinMode(led, OUTPUT);
  pinMode(red, OUTPUT);
  pinMode(green, OUTPUT);
  pinMode(blue, OUTPUT);
}

void loop() {
  digitalWrite(led, HIGH); // turn the LED on (HIGH is the voltage level)
  digitalWrite(red, HIGH);
  digitalWrite(blue, HIGH);
  digitalWrite(green, HIGH);
  delay(1000); // wait for a second
  digitalWrite(led, LOW); // turn the LED off by making the voltage LOW
  digitalWrite(red, LOW);
  digitalWrite(blue, HIGH);
  digitalWrite(green, HIGH);
  delay(1000); // wait for a second
}

Obstacle Avoidance plus Edge Avoidance Code (Arduino Language):

// ----- Motors
int motor_left[] = {28, 29};  // digital pins for left motor
int motor_right[] = {22, 23}; // digital pins for right motor
int motor_leftPWM = 2;        // PWM pin for left motor
int motor_rightPWM = 3;       // PWM pin for right motor
int ledPin = 13;              // LED connected to digital pin 13

// ----- IR Sensors
int sensorValue1 = 0; // initialize all sensor values
int sensorValue2 = 0;
int sensorValue3 = 0;
int sensorValue4 = 0;
int sensorValue5 = 0;

// ----- Setup
void setup() {
  Serial.begin(9600); // serial communication at 9600 bits per sec
  // set motor pins as outputs
  for (int i = 0; i < 2; i++) {
    pinMode(motor_left[i], OUTPUT);
    pinMode(motor_right[i], OUTPUT);
  }
  pinMode(ledPin, OUTPUT);
  pinMode(motor_leftPWM, OUTPUT);
  pinMode(motor_rightPWM, OUTPUT);
}

// ----- Driver settings
void motor_stop() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 0);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 0);
  delay(25);
}

void turn_left() {
  digitalWrite(motor_left[0], HIGH);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 200);
  digitalWrite(motor_right[0], HIGH);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 200);
}

void turn_right() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], HIGH);
  analogWrite(motor_leftPWM, 200);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], HIGH);
  analogWrite(motor_rightPWM, 200);
}

void drive_forward() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], HIGH);
  analogWrite(motor_leftPWM, 200);
  digitalWrite(motor_right[0], HIGH);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 200);
}

void drive_backward() {
  digitalWrite(motor_left[0], HIGH);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 200);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], HIGH);
  analogWrite(motor_rightPWM, 200);
}

// ----- Loop
void loop() {
  // if there is any doubt that the IR sensors are working, this provides a look
  sensorValue1 = analogRead(A7); // move away when >550
  Serial.print(sensorValue1);
  Serial.print(',');
  delay(1); // delay in between reads for stability
  sensorValue2 = analogRead(A8); // move away when >600
  Serial.print(sensorValue2);
  Serial.print(',');
  delay(1);
  sensorValue3 = analogRead(A9); // move away when >600
  Serial.print(sensorValue3); // print, not println, so the readings stay on one line
  Serial.print(',');
  delay(1);
  sensorValue4 = analogRead(A10); // move away when less than 690
  Serial.print(sensorValue4);
  Serial.print(',');
  delay(1);
  sensorValue5 = analogRead(A11); // move away when less than 600
  Serial.println(sensorValue5);
  delay(1);

  // Obstacle avoidance using IR sensors
  if (sensorValue1 > 500)
  {
    drive_backward();
    delay(250);
    turn_right();
    Serial.println("turn right");
    delay(500);
  }
  else if (sensorValue2 > 550) {
    drive_backward();
    Serial.println("back up");
    delay(1000);
    if (sensorValue2 < 300 && sensorValue4 > 692) {
      turn_left();
      delay(500);
    } else {
      turn_right();
      delay(500);
    }
  }
  else if (sensorValue3 > 530) {
    drive_backward();
    delay(250);
    turn_left();
    Serial.println("turn left");
    delay(500);
  }
  else if (sensorValue4 < 600) {
    drive_backward();
    delay(500);
    if (sensorValue2 < 300) { turn_right(); }
    else { turn_left(); }
  }
  else if (sensorValue5 < 600) {
    drive_backward();
    delay(550);
    if (sensorValue2 < 300) { turn_left(); }
    else { turn_right(); }
  }
  else {
    drive_forward();
    Serial.println("forward");
  }
}

Both Avoidances plus Face Detection Serial Code (Arduino Language):

// ----- Motors
int motor_left[] = {28, 29};  // digital pins for left motor
int motor_right[] = {22, 23}; // digital pins for right motor
int motor_leftPWM = 2;        // PWM pin for left motor
int motor_rightPWM = 3;       // PWM pin for right motor
int ledPin = 13;              // LED connected to digital pin 13

// ----- IR Sensors
int sensorValue1 = 0; // initialize all sensor values
int sensorValue2 = 0;
int sensorValue3 = 0;
int sensorValue4 = 0;
int sensorValue5 = 0;

// ----- Servo/Cannon
#include <Servo.h>
Servo myservo;
int pos;

// ----- LEDs
int ledred = 33;
int ledgreen = 35;
int ledblue = 37;

// ----- Setup
void setup() {
  Serial.begin(9600); // serial communication at 9600 bits per sec
  myservo.attach(9);  // attaches the servo on pin 9
  pinMode(ledred, OUTPUT);
  pinMode(ledgreen, OUTPUT);
  pinMode(ledblue, OUTPUT);
  // set motor pins as outputs
  for (int i = 0; i < 2; i++) {
    pinMode(motor_left[i], OUTPUT);
    pinMode(motor_right[i], OUTPUT);
  }
  pinMode(ledPin, OUTPUT);
  pinMode(motor_leftPWM, OUTPUT);
  pinMode(motor_rightPWM, OUTPUT);
}

// ----- Driver settings
void motor_stop() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 0);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 0);
  delay(25);
}

void turn_left() {
  digitalWrite(motor_left[0], HIGH);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], HIGH);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 128);
}

void turn_right() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], HIGH);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], HIGH);
  analogWrite(motor_rightPWM, 128);
}

void drive_forward() {
  digitalWrite(motor_left[0], LOW);
  digitalWrite(motor_left[1], HIGH);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], HIGH);
  digitalWrite(motor_right[1], LOW);
  analogWrite(motor_rightPWM, 128);
}

void drive_backward() {
  digitalWrite(motor_left[0], HIGH);
  digitalWrite(motor_left[1], LOW);
  analogWrite(motor_leftPWM, 128);
  digitalWrite(motor_right[0], LOW);
  digitalWrite(motor_right[1], HIGH);
  analogWrite(motor_rightPWM, 128);
}

// ————————————————————————— Loop
void loop() {
  char shoot = Serial.read();      // command character sent by the face-detection script
  sensorValue1 = analogRead(A7);   // outward obstacle IR: avoid when > 500
  sensorValue2 = analogRead(A8);   // outward obstacle IR: avoid when > 550
  sensorValue3 = analogRead(A9);   // outward obstacle IR: avoid when > 530
  sensorValue4 = analogRead(A10);  // downward edge IR: avoid when < 600
  sensorValue5 = analogRead(A11);  // downward edge IR: avoid when < 600

  if (sensorValue1 > 500) {
    drive_backward();
    delay(250);
    turn_right();
    Serial.println("turn right");
    delay(500);
  }
  else if (sensorValue2 > 550) {
    drive_backward();
    Serial.println("back up");
    delay(1000);
    if (sensorValue2 < 300 && sensorValue4 > 692) {
      turn_left();
      delay(500);
    }
    else {
      turn_right();
      delay(500);
    }
  }
  else if (sensorValue3 > 530) {
    drive_backward();
    delay(250);
    turn_left();
    Serial.println("turn left");
    delay(500);
  }
  else if (sensorValue4 < 600) {   // table edge detected
    drive_backward();
    delay(500);
    if (sensorValue2 < 300) { turn_right(); }
    else
    { turn_left(); }
  }
  else if (sensorValue5 < 600) {   // table edge detected
    drive_backward();
    delay(550);
    if (sensorValue2 < 300) { turn_left(); }
    else { turn_right(); }
  }
  else {
    drive_forward();
    Serial.println("forward");
  }

  // React to the characters received over serial from the IP camera script
  if (shoot == 'g' || shoot == 'b' || shoot == 'a' || shoot == 'r') {
    Serial.println("in loop");
    if (shoot == 'b' || shoot == 'a' || shoot == 'r') {
      motor_stop();
      if (shoot == 'b') {            // user too close: flash the red LED
        digitalWrite(ledred, HIGH);
        delay(1000);                 // wait for a second
        digitalWrite(ledred, LOW);
        delay(1000);
        Serial.println("back up");
      }
      else if (shoot == 'a') {       // user too far away: flash the blue LED
        digitalWrite(ledblue, HIGH);
        delay(1000);
        digitalWrite(ledblue, LOW);
        delay(1000);
        Serial.println("come forward");
      }
      else if (shoot == 'r') {       // user in position: flash the green LED, then fire
        digitalWrite(ledgreen, HIGH);
        delay(1000);
        digitalWrite(ledgreen, LOW);
        delay(1000);
        Serial.println("ready");
        delay(2000);
        pos = 0;
        myservo.write(pos);          // swing the servo to trigger the launcher
        delay(600);
        pos = 180;
        myservo.write(pos);
        delay(400);
        pos = 90;
        myservo.write(pos);          // return the servo to its rest position
        delay(1000);
        Serial.print("shot");
      }
    }
    // If there is any doubt that the IR sensors are working, uncomment for a look:
    // Serial.print(sensorValue1); Serial.print(','); delay(1);
    // Serial.print(sensorValue2); Serial.print(','); delay(1);
    // Serial.print(sensorValue3); Serial.print(','); delay(1);
    // Serial.print(sensorValue4); Serial.print(','); delay(1);
    // Serial.println(sensorValue5); delay(1);
  }
  else if (shoot == 's') {
    motor_stop();
  }
}

Face Detection (Using webcam) (Python Language):

#!/usr/bin/env python
import numpy as np
import cv2

# local modules from the OpenCV samples
from video import create_capture
from common import clock, draw_str

help_message = '''USAGE:
facedetect.py [--cascade <cascade_fn>] [--nested-cascade <cascade_fn>] [<video_source>]'''

def detect(img, cascade):
    rects = cascade.detectMultiScale(img, scaleFactor=1.3, minNeighbors=4,
                                     minSize=(30, 30), flags=cv2.CASCADE_SCALE_IMAGE)
    if len(rects) == 0:
        return []
    rects[:, 2:] += rects[:, :2]    # convert (x, y, w, h) to (x1, y1, x2, y2)
    return rects

def draw_rects(img, rects, color):
    for x1, y1, x2, y2 in rects:
        cv2.rectangle(img, (x1, y1), (x2, y2), color, 2)

if __name__ == '__main__':
    import sys, getopt
    print help_message
    args, video_src = getopt.getopt(sys.argv[1:], '', ['cascade=', 'nested-cascade='])
    try:
        video_src = video_src[0]
    except:
        video_src = 0
    args = dict(args)
    cascade_fn = args.get('--cascade', "../../data/haarcascades/haarcascade_frontalface_alt.xml")
    nested_fn = args.get('--nested-cascade', "../../data/haarcascades/haarcascade_eye.xml")
    cascade = cv2.CascadeClassifier(cascade_fn)
    nested = cv2.CascadeClassifier(nested_fn)
    cam = create_capture(video_src, fallback='synth:bg=../cpp/lena.jpg:noise=0.05')
    while True:
        ret, img = cam.read()
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        gray = cv2.equalizeHist(gray)
        t = clock()
        rects = detect(gray, cascade)
        vis = img.copy()
        draw_rects(vis, rects, (0, 255, 0))
        for x1, y1, x2, y2 in rects:
            roi = gray[y1:y2, x1:x2]
            vis_roi = vis[y1:y2, x1:x2]
            subrects = detect(roi.copy(), nested)
            draw_rects(vis_roi, subrects, (255, 0, 0))
        dt = clock() - t
        if len(rects) > 0:          # only report a distance when a face was found
            x = x2 - x1             # face width in pixels, used as a rough distance estimate
            draw_str(vis, (20, 40), 'Distance: %.1f' % x)
        draw_str(vis, (20, 20), 'time: %.1f ms' % (dt * 1000))
        cv2.imshow('facedetect', vis)
        if 0xFF & cv2.waitKey(5) == 27:
            break
    cv2.destroyAllWindows()

Face Detection plus Serial Communication (Using webcam) (Python Language):

#!/usr/bin/env python
import numpy as np
import cv2

# local modules from the OpenCV samples
from video import create_capture
from common import clock, draw_str

help_message = '''USAGE:
facedetect.py [--cascade <cascade_fn>] [--nested-cascade <cascade_fn>] [<video_source>]'''

def detect(img, cascade):
    rects = cascade.detectMultiScale(img, scaleFactor=1.3, minNeighbors=4,
                                     minSize=(30,
                                     30), flags=cv2.CASCADE_SCALE_IMAGE)
    if len(rects) == 0:
        return []
    rects[:, 2:] += rects[:, :2]    # convert (x, y, w, h) to (x1, y1, x2, y2)
    return rects

def draw_rects(img, rects, color):
    for x1, y1, x2, y2 in rects:
        cv2.rectangle(img, (x1, y1), (x2, y2), color, 2)

if __name__ == '__main__':
    import sys, getopt
    import serial
    print help_message
    args, video_src = getopt.getopt(sys.argv[1:], '', ['cascade=', 'nested-cascade='])
    try:
        video_src = video_src[0]
    except:
        video_src = 0
    args = dict(args)
    cascade_fn = args.get('--cascade', "../../data/haarcascades/haarcascade_frontalface_alt.xml")
    nested_fn = args.get('--nested-cascade', "../../data/haarcascades/haarcascade_eye.xml")
    cascade = cv2.CascadeClassifier(cascade_fn)
    nested = cv2.CascadeClassifier(nested_fn)
    cam = create_capture(video_src, fallback='synth:bg=../cpp/lena.jpg:noise=0.05')
    # open the serial link to the Arduino once, before the detection loop
    ser = serial.Serial('/dev/ttyUSB1', 9600, timeout=1)
    while True:
        ret, img = cam.read()
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        gray = cv2.equalizeHist(gray)
        t = clock()
        rects = detect(gray, cascade)
        vis = img.copy()
        draw_rects(vis, rects, (0, 255, 0))
        for x1, y1, x2, y2 in rects:
            roi = gray[y1:y2, x1:x2]
            vis_roi = vis[y1:y2, x1:x2]
            subrects = detect(roi.copy(), nested)
            draw_rects(vis_roi, subrects, (255, 0, 0))
        dt = clock() - t
        if len(rects) > 0:          # only judge distance when a face was found
            x = x2 - x1             # face width in pixels, used as a rough distance estimate
            if 95 < x < 105:                       # user at the right distance
                draw_str(vis, (20, 40), 'Perfect')
                ser.write('r')                     # tell the Arduino to fire
            elif x > 105:                          # user too close
                draw_str(vis, (20, 60), 'Move Back')
                ser.write('b')
            elif x < 95:                           # user too far away
                draw_str(vis, (20, 80), 'Come Forward')
                ser.write('a')
            draw_str(vis, (20, 20), 'Distance: %.1f' % x)
        # draw_str(vis, (20, 20), 'time: %.1f ms' % (dt * 1000))
        cv2.imshow('facedetect', vis)
        if 0xFF & cv2.waitKey(5) == 27:
            break
    cv2.destroyAllWindows()

Wiring Diagram:

