THE P3-AT MOBILE ROBOT


I. Introduction

Ever wonder why it seems at times that you have too many aspects of life to deal with? For example, wouldn't life be less stressful if a robot could bring you a fresh cup of coffee while you continue to work those long, tiring hours in your office doing life's biggest chore, "WORK"? Well, this idea has joined the huge pile of robotics research and has been successfully put into action in a variety of ways. For instance, my team member, Brandon Morton, and I were fortunate to be chosen to take part in this research experience by helping fellow graduate students at Stevens implement the OpenCV library with the ARIA software developed for the Pioneer 3 All-Terrain (P3-AT) robot model. Before we go into further detail about the goal we are trying to achieve, we first need to look back at the development and history of the earlier Pioneer mobile robot platforms so that you have a better understanding of how the P3-AT model evolved.

The Pioneer Reference Platform

The P3-AT, the robot our team is modifying software for, was developed by a company called MobileRobots. Its platform sets the base for intelligent mobile robots by including all of the standard accessories for sensing and navigating a real-world environment. In addition, the P3-AT has become something of a reference design in many projects, including a wide variety of investigations funded by the US Defense Advanced Research Projects Agency (DARPA). It comes complete with a strong aluminum body, a well-balanced four-wheel skid-steer drive system, reversible DC motors, motor-control and drive electronics, motion encoders, and battery power. These features are all handled by an onboard controller and mobile-robot server software. The P3-AT also comes with a host of intelligent robot-control client programs and an environment for application development. In this environment, our primary goal is to develop software for the robot, using the OpenCV library, that serves as an object-tracking system. The P3-AT originally came with software called ActivMedia Robotics Interface for Applications (ARIA), located on the Linux-based operating system on its server. Brandon and I are essentially modifying and extending this software to add the object-tracking feature we want.

II. The Pioneer Legacy


A. The Pioneer 1

Introduced in the summer of 1995 in Montreal, the original robot platform was called the Pioneer 1.

This model was intended for indoor use on hard, level surfaces. It came equipped with solid rubber tires, a two-wheel differential, reversible drive system, and a rear caster to help maintain balance. It also contained a single-board 68HC11-based robot controller running the Pioneer Server Operating System (PSOS) software. Among its features, the Pioneer 1 could detect ranges with its onboard sonar system. Not only was the Pioneer 1 easily affordable, but its high performance gave a number of researchers and developers direct access to a real-world mobile robotic platform. It did not take long before the Pioneer's open architecture became the appreciated base for a diverse field of other robotics software environments.

B. The Pioneer-AT

Presented in the summer of 1997, the Pioneer AT was nearly identical to the Pioneer 1 in functionality and programming but used a four-wheel-drive, skid-steer design. This model was basically a rough-terrain enhancement of its predecessor. The Pioneer AT, or Pioneer All-Terrain, was used in uneven indoor and outdoor places, including loose and rough terrain. The main difference between the Pioneer 1 and the Pioneer AT was the drive system; everything else was pretty much the same, including the sonar arrays, controllers, and accessories.

C. Pioneer 2: The Second Generation

Right after the first generation of Pioneer robots, the second generation got its start. The Pioneer 2 prototype debuted between the fall of 1998 and the summer of 1999. As you can probably guess, the Pioneer 2 was an enhancement of the Pioneer 1 legacy: while it retained many components of its originator, it offered more expansion options, including room for a client PC onboard the robot. Furthermore, there were many ActivMedia Robotics Pioneer 2 models. Some of the models that were part of this group are listed below:

Pioneer 2-DX, Pioneer 2-DXe, and Pioneer 2-AT

[Photos of the Pioneer 2-DX, 2-DXe, and 2-AT models]

Some models, such as the Pioneer 2-DX, -DE, -DXe, -DXf, and -AT, used a 20 MHz Siemens 88C166-based controller. On top of that, they also had stand-alone motor-power and sonar controller boards for unpredictable operating environments. Like the Pioneer 1, models such as the Pioneer 2-DX, -DXe, -DXf, and -CE were two-wheel, differential-drive robots. One of the differences between the two generations was that the second sported a more holonomic shape, bigger wheels, and stronger motors for improved indoor performance.

The P3-AT’s Predecessor

The P3-AT mobile robot that Brandon and I are working on also had a predecessor: as you may have already guessed, the Pioneer 2-AT (P2-AT). This robot, like the P3-AT, was four-wheel drive with four stand-alone motors and drivers. Unlike its Pioneer AT predecessor, it came with a stall-detection system and inflatable pneumatic tires. The P2-AT also had metal wheels so that it could easily handle operations in rough terrain. It could also carry nearly 66 pounds of cargo and climb a 60-percent grade. How neat!

D. The 3rd Generation: Pioneer 3 and Recent Pioneer 2-DX8, -AT8, and Plus Mobile Robots

In the summer of 2002, two new Pioneer 2 models appeared, and two more arrived at the start of 2003. The Pioneer 3 was presented in the summer of 2003. Each model used a distinct [P2H8] controller based on the Hitachi H8S microprocessor. In addition, they ran new control-system software, called Advanced Robot Control & Operation Software (ARCOS), along with I/O expansion features. The Pioneer 2-Plus and Pioneer 3 mobile robots had more powerful motor and power systems, giving them better navigational handling and a larger cargo capacity.

III. Software

A. Client

The P3-AT platform works as the server within a client-server setup: its controller handles the low-level details of mobile robotics, such as the robot's acceleration and heading over uneven terrain. It also gathers sensor readings, such as from the sonar, and operates the robot's attached accessories. A connected PC, however, is needed to complete the client-server structure: the user runs client software on a computer connected to the robot's controller via the HOST serial link. One benefit of the client-server architecture is that the same client can run against different robot servers. Another advantage is that many clients can share responsibility for operating a single server, which opens the door to experiments in networked, cooperative control and allows higher-level planning and control of the robot.

B. ARIA and ARNetworking

As mentioned in the opening section, the P3-AT ships with an important software package, called ARIA, which underlies the many different programs the robot runs. Now let's get into further detail about this software. As a C++-based open-source development environment, ARIA provides a robust client-side interface to many robotics systems, including the P3-AT's controller and accessory components. ARIA is the ideal platform for integrating our own modified robot-control software: it cleanly handles the lowest-level details of client-server communication, serial communication, command and server-information packet handling, and cycle timing. It also handles multithreading and supports many accessories and controls, such as the laser range finder and sonar. A companion library, ARNetworking, provides the critical TCP/IP-based communication with the robot platform over the network. The great thing about ARIA and ARNetworking is that they come complete with fully documented C++, Java, and Python libraries and source code, which gives our team a big head start toward integrating the OpenCV library with our own modified programs.
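To give a feel for how an ARIA client talks to the robot server, here is a minimal C++ sketch. It follows the classic ArSimpleConnector pattern from the ARIA example programs; it is not CompVision itself, and the exact connector class names can vary slightly between ARIA versions.

// ariaClientSketch.cpp -- minimal ARIA client sketch (illustrative only).
// Assumes the ArSimpleConnector pattern from the ARIA examples.
#include "Aria.h"

int main(int argc, char **argv)
{
  Aria::init();                          // initialize the ARIA library
  ArSimpleConnector connector(&argc, argv);
  ArRobot robot;

  if (!connector.parseArgs())
  {
    connector.logOptions();
    Aria::shutdown();
    return 1;
  }

  // Connect to the robot server (the onboard controller), either over the
  // HOST serial link or a TCP connection, depending on the arguments given.
  if (!connector.connectRobot(&robot))
  {
    ArLog::log(ArLog::Terse, "Could not connect to the robot.");
    Aria::shutdown();
    return 1;
  }

  robot.runAsync(true);                  // run the robot's processing cycle in a background thread
  robot.enableMotors();                  // allow the server to drive the motors

  // Send a simple client command: drive forward slowly for two seconds, then stop.
  robot.lock();
  robot.setVel(200);                     // translational velocity in mm/s
  robot.unlock();
  ArUtil::sleep(2000);

  robot.lock();
  robot.stop();
  robot.unlock();

  Aria::shutdown();                      // disconnect and clean up
  return 0;
}

Our CompVision program builds on this same connect-then-command structure, with the OpenCV tracking code layered on top.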

IV. Connecting to the Robot

A. Logging Into Linux

Now that we have finished with the history and development of the P3-AT robot, we can introduce the fun part of our project: playing with it. The first thing you have to do in order to play with the P3-AT is, of course, turn on the power. Before you do this, however, make sure the robot has enough battery power to run on; if not, take time to charge the battery. Listed below is a step-by-step procedure for logging into the robot's Linux server.

Internet Protocol (TCP/IP)

1) Set your computer's IP address to 192.168.1.xxx (where xxx is any number between 1 and 255).

NOTE: Do not use the P3-AT's own IP address, which is 192.168.1.23.

2) Now, use a remote-login client (such as Telnet or SSH) to connect to the robot's server. Brandon and I connected to the robot using a Linksys wireless router and SSH. The HOST name, i.e. the robot's IP address, is 192.168.1.23.

3) Now, all you have to do is log in using [guest] as the username. See the example below:

[Screenshot: logging in to the P3-AT's Linux server as guest]

4) That’s it! You’re now inside the P3-AT’s Linux operating system.

B. Using the VNC server

VNC provides a remote graphical desktop (GUI) for the robot's Linux system. Brandon and I downloaded a VNC viewer from the internet in order to view images from the camera located on the robot. If you plan to view any kind of video or images from the robot's Linux system, you will have to download a viewer and run the VNC server in order to do so. You can do this in the four simple steps listed below:

1) First, make sure you are logged into the Linux system onboard the P3-AT robot. If you forgot how to do this, refer back to the "Logging Into Linux" steps above.

2) Once you are inside the Linux operating system of the P3-AT robot, simply type [vncserver] at the command prompt, with no space between "vnc" and "server". Wait for a response from Linux telling you which port number is free to use. An example is shown in figure 2-1 below:

Figure 2-1: vncserver output reporting the free port number

3) Now, open the VNC viewer you downloaded earlier. In the "Server:" dialog box, type the robot's IP address followed by a colon and the port number you received. From figure 2-1, you would type something like 192.168.1.23:2, with the "2" after the colon being the port number Linux gave you above.


Then, click the [OK] button or simply press [ENTER].

4) You will then be prompted to type in the password, which is 123456. After you type the password in, press [ENTER]. That's it! You can now use the VNC server.

V. The “CompVision” Program

A. Overview

As part of our team's research goal, we were trying to get the P3-AT robot to detect objects, human faces in particular. In addition, we also wanted the robot to follow an object if it moved out of range. We found two existing programs, under the Aria and acts directories on the robot's Linux server, that helped our team view objects in a display window and operate the camera onboard the P3-AT. The acts program, found in the acts directory, displays a viewing window and controls the camera, while the CamShift program, found under the Aria directory, provides the object-tracking capability. To reach our goal, we combined both of these programs into one and named it CompVision. Since it was developed to display images through the VNC server, make sure you run this particular program from a VNC session; otherwise, you will not be able to view images from the camera. Instructions on how to run CompVision are listed in the next section.

B. Running CompVision

Inside any VNC server terminal, type in:

cd /usr/local/Aria/examples

to get into the examples directory where the CompVision program is located. Once you are inside this directory, simply type CompVision at the command prompt, with no space between [Comp] and [Vision]. You will then need to click your mouse button twice so that you can place the Histogram box and the CamShiftDemo viewing window somewhere for object tracking. An example is given below:

[Screenshot: CompVision's Histogram box and CamShiftDemo viewing window]

C. Resolution Troubleshooting

As you can see, the CompVision program lives in the /usr/local/Aria/examples directory. When running it, you will notice that a viewing window and a histogram box are used for the object-tracking feature. To clear up the corrupted resolution of the CamShiftDemo window, all you have to do is run the acts program in another X terminal (using the VNC server, of course). You can do this by typing the following command at the command prompt:

cd /usr/local/acts/bin

Once you are inside the bin directory, simply type [acts] at the command prompt. Like the CompVision program, the acts program requires you to click your mouse button twice to place its two boxes when it starts. An example is shown below:

[Screenshot: the acts program windows]

Once the acts program is running, simply quit it by clicking

File -> quit -> yes

Now you can return to the terminal where CompVision is running, and you should see a clear picture of whatever the camera is looking at in the larger CamShiftDemo display window. See figure 4-3 below:

Figure 4-3: The CamShiftDemo display window and the camera-control "Commands:" box

D. Using CompVision

As mentioned earlier, the CamShiftDemo image-viewing window and the Histogram box work together as the object-tracking unit. In the CamShiftDemo window, select the object you want to track by clicking and holding the mouse button to draw a rectangle around it. Once the mouse is released, the CamShiftDemo window will keep tracking the selected object, marking it with a red circle. The Histogram box displays the color variations of the selected object inside the CamShiftDemo window. To control the movement of the camera, first click anywhere inside the terminal where CompVision is running; then you can operate the camera using the keys under the "Commands:" dialog. See figure 4-3 above for reference. To quit the CompVision program at any time, simply press the [Esc] key inside the same terminal where the camera-control keys are listed.
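To illustrate the kind of tracking loop CompVision is built around, here is a short C++ sketch of the CamShift technique using OpenCV's C++ API. It is modeled on OpenCV's own camshiftdemo example rather than on CompVision's actual source, so the camera index, the hard-coded initial selection, and the histogram parameters are assumptions for illustration only.

// camshiftSketch.cpp -- illustrative CamShift tracking loop (not CompVision's actual code).
// Based on OpenCV's camshiftdemo; parameters and the fixed selection are simplifying assumptions.
#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap(0);                 // assumed camera index
    if (!cap.isOpened()) return 1;

    cv::Rect trackWindow(200, 150, 80, 80);  // assumed initial selection (CompVision uses a mouse-drawn rectangle)
    cv::Mat frame, hsv, hue, mask, hist, backproj;
    const int histSize = 16;
    const float hueRange[] = {0, 180};
    const float *ranges[] = {hueRange};
    bool histReady = false;

    for (;;)
    {
        cap >> frame;
        if (frame.empty()) break;

        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        // Keep only reasonably saturated, bright pixels so the hue histogram is meaningful.
        cv::inRange(hsv, cv::Scalar(0, 60, 32), cv::Scalar(180, 255, 255), mask);
        hue.create(hsv.size(), CV_8UC1);
        int fromTo[] = {0, 0};
        cv::mixChannels(&hsv, 1, &hue, 1, fromTo, 1);

        if (!histReady)
        {
            // Build the hue histogram of the selected region once (the data shown in the Histogram box).
            cv::Mat roi(hue, trackWindow), maskRoi(mask, trackWindow);
            cv::calcHist(&roi, 1, 0, maskRoi, hist, 1, &histSize, ranges);
            cv::normalize(hist, hist, 0, 255, cv::NORM_MINMAX);
            histReady = true;
        }

        // Back-project the histogram onto the frame and let CamShift follow the peak.
        cv::calcBackProject(&hue, 1, 0, hist, backproj, ranges);
        backproj &= mask;
        cv::RotatedRect trackBox = cv::CamShift(
            backproj, trackWindow,
            cv::TermCriteria(cv::TermCriteria::EPS | cv::TermCriteria::COUNT, 10, 1));

        cv::ellipse(frame, trackBox, cv::Scalar(0, 0, 255), 2);   // red outline around the tracked object
        cv::imshow("CamShiftDemo", frame);
        if (cv::waitKey(30) == 27) break;                         // [Esc] quits, just like CompVision
    }
    return 0;
}

In CompVision the tracked position would then be fed back through ARIA to pan or tilt the camera toward the object, which is the part the "Commands:" keys let you drive by hand.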

VI. Packages

A. sickLogger

The sickLogger package makes use of the laser range finder on the robot. Using the information gathered from the laser, a 2D map can be created. The data is saved as a .2d file and can later be converted to a .map file (the type used by Mapper3 and MobileEyes).

To run the sickLogger program, type cd /usr/local/Arnl/examples to get into the directory the program is in, then type sickLogger and the program should run. Once the laser is available, you can drive the robot around an area using the arrow keys (similar to the teleop mode in the demo program). When you have driven around the area you wish to map, press [Esc] to close the program. The program will tell you the filename where the map has been saved ("Logging to file…"). A picture is shown below:

Figure 3-1: Running sickLogger

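As a rough idea of how a program like sickLogger gets at the laser data through ARIA, here is a hedged C++ sketch. It follows older ARIA examples that use the ArSick class and ArSimpleConnector; the exact setup calls (and whether your ARIA build uses ArSick or the newer ArLaser interface) are assumptions to check against your installation, and this sketch only prints readings rather than writing a .2d log.

// laserSketch.cpp -- hedged sketch of reading the SICK laser through ARIA (not sickLogger itself).
// Assumes the older ArSick/ArSimpleConnector interfaces; names vary by ARIA version.
#include "Aria.h"

int main(int argc, char **argv)
{
  Aria::init();
  ArSimpleConnector connector(&argc, argv);
  ArRobot robot;
  ArSick sick;                                // SICK laser range finder interface

  if (!connector.parseArgs())
  {
    connector.logOptions();
    Aria::shutdown();
    return 1;
  }

  if (!connector.connectRobot(&robot))
  {
    ArLog::log(ArLog::Terse, "Could not connect to the robot.");
    Aria::shutdown();
    return 1;
  }

  robot.addRangeDevice(&sick);                // make the laser available as a range device
  robot.runAsync(true);

  connector.setupLaser(&sick);                // configure the laser from the parsed arguments
  sick.runAsync();
  if (!sick.blockingConnect())
  {
    ArLog::log(ArLog::Terse, "Could not connect to the laser.");
    Aria::shutdown();
    return 1;
  }

  // Poll the closest reading in front of the robot a few times.
  for (int i = 0; i < 10; i++)
  {
    double angle;
    sick.lockDevice();
    double range = sick.currentReadingPolar(-90, 90, &angle);   // distance in mm, angle in degrees
    sick.unlockDevice();
    ArLog::log(ArLog::Normal, "Closest obstacle: %.0f mm at %.0f deg", range, angle);
    ArUtil::sleep(500);
  }

  Aria::shutdown();
  return 0;
}

sickLogger does essentially this while also recording the robot's odometry alongside each scan, which is what makes the resulting .2d file usable as a map.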

B. Mapper 3

Mapper3 is a program that allows you to create and edit maps that can be used by the other ARIA-based programs. You can also load .2d files from the sickLogger program and convert them into .map files.

To run Mapper3 you must first be able to bring up the GUI of the Linux server located on the robot. Instructions on how to do this are in the Connecting to the Robot section of the outline. Once that is done, type Mapper3 on the command line. To convert a .2d map to a .map file and see the actual map, open the .2d file in the program; the Scan Tools toolbar should appear. Save your map as a .map file by clicking the Save As icon, making sure the file extension is .map. The map is now available for editing. This is shown in figure 3-2a below:

Figure 3-2a: Converting a .2d file to a .map file


C. MobileEyes

The MobileEyes program is a command center of sorts. You can use it to create maps, control one or more robots, drive your robot autonomously, and much more.

Starting MobileEyes

To start MobileEyes, you must first be able to bring up the GUI of the Linux server on the robot. Instructions on how to do this are in the Connecting to the Robot section of the outline. After this, type cd /usr/local/Arnl/examples and then type guiServer. This runs a demonstration server, which is needed to give MobileEyes control of the robot. Now, type MobileEyes on the command line. This will take you to the startup screen shown in figure 3-3a.

Figure 3-3a: The MobileEyes startup screen

For our login, we just had to click Connect because our robot had no user name or password.

Creating a Map

To create a map, go to Tools -> MapCreation -> Start Scan. Now just drive around the area you want to map. After this, go to Tools -> MapCreation -> Stop Scan. Now, if you go to the Arnl examples directory (cd /usr/local/Arnl/examples), you will find your map as a .2d file. MobileEyes can use both .2d files and .map files. You can now open your map in MobileEyes by going to Tools -> RobotConfiguration. Under Sections, choose Files; under Parameters, there should be a Map parameter. Set its value to the file that you would like to use.

Using the Keyboard to Operate Your Robot

Under the Robot menu, select Arrow Key Drive.

Touring Goals

Touring goals will make the robot visit each goal that has been set up in your map. Press the Tour Goals icon on the toolbar.

Directing Your Robot To A Point

To direct your robot to a point, press the Send Robot icon on the toolbar and pick a point on the map.
