


Networked UAV C3

Final Report

Stage 1

Version 1.0

30 June 2005

Prepared Under Subcontract 04-0382 with L-3 Communications, ComCept Division, Contract Data Requirements List (CDRL) item A003, Final Report

Principal Investigator:
Timothy X Brown
Interdisciplinary Telecommunications
Electrical and Computer Engineering
University of Colorado, Boulder 80309-0530
Phone: 303-492-1630
Fax: 303-492-1112
timxb@colorado.edu

Co-Principal Investigator:
Brian M. Argrow
Director, RECUV
Aerospace Engineering Sciences
University of Colorado, Boulder 80309-0429
Phone: 303-492-7881
Fax: 303-492-5312
brian.argrow@colorado.edu

Sponsor Contact:

Kenneth Davey

L-3 Communications Corporation

ComCept Division

2800 Discovery Blvd.

Rockwall, Texas 75032

Phone: 972-772-7501

Fax: 972-772-7510

ken.davey@L-


Executive Summary

The networked unmanned aerial vehicle (UAV) command, control, and communication (C3) program is the culmination of two major initiatives. The first was to establish a full-scale wireless test bed for assessing commercial-off-the-shelf (COTS) wireless-based communication networks made up of airborne and terrestrial nodes and components. The second major initiative was to harness this networking capability to enable groups of UAVs to complete high-level mission tasks cooperatively and autonomously. The first initiative was completed under a previous Wireless Communication Test Bed Program at the University of Colorado. The second initiative has been started, and Stage 1 of the Networked UAV C3 program, reported on here, represents the first step.

Stage 1 focused on a single UAV and developed the UAV hardware and testing infrastructure for autonomous cooperative groups of UAVs. It integrated UAV autonomous flight control (AFC), network communication, and sensor subsystems with an onboard communication bus and a supervisory computer. This enabled software algorithms to control the operation of the plane based on flight, sensor, communication, and mission status intelligently. In parallel, a UAV remote monitoring and control software was incorporated into the design. A hardware-in-the-loop (HIL) laboratory test bed was constructed so that hardware, software, and interoperability elements could be rapidly developed and tested. The final system was flown with monitoring and experimental data collected and stored.

The experimental results show that the UAV can be remotely commanded with mission tasks that are interpreted on board into specific actions. The plane flew flight patterns that depended on sensor measurements and communication status. Post-processing of the sensor data revealed that a small near-ground UAV can collect unique and interesting data.

Future stages of the Networked UAV C3 program will build on the Stage 1 capabilities to develop single-UAV mission intelligence that solves useful tasks robustly based on high-level operator commands. This will be extended to multiple UAVs, and mechanisms will be developed for multiple UAVs to share flight, sensor, communication, and tasking data.

Introduction

The ComCept Division, L-3 Communications Corporation (ComCept) has engaged the University of Colorado (CU) to design, install, and operate a wireless communications test bed and to integrate and operate Unmanned Aerial Vehicles (UAVs) that interact with it. The next step in the Wireless Communication Network effort is a project to integrate and deliver UAV control systems and intelligent software algorithms for tying sensor measurements, network intelligence (e.g. RF signal level, data throughput), and mission-level tasking information into automatic flight controls for groups of UAVs. The goal is to deliver advanced command, control, and communication (C3) for networks of small UAVs. The project has been divided into stages. This is the final report on the first stage, describing CU’s effort to develop the necessary UAV platform capabilities and results from platform testing.

Scope

Control systems that allow for autonomous operation will be integrated into existing UAVs. Software algorithms for tying sensor measurements, network intelligence (e.g. RF signal level, data throughput), and mission-level tasking information into automatic flight controls will refine UAV control systems and software, advance the UAV C3 technology, and provide a hardware base for a multi-plane network for missionizing group-UAV applications. Within this larger context, this effort focuses on small UAVs (~10 kg) because they are small enough to allow cost-effective development and testing of multi-plane control algorithms while large enough to carry interesting sensor payloads.

The effort covers the integration of communication, sensing, and navigation subsystems to enable coordinated action of these subsystems on a single UAV. These components were tested in both laboratory and field settings. A hardware-in-the-loop test lab was developed which enabled rapid and continuous testing during the integration effort. These were augmented by field flights that culminated in a complete system test.

This report describes the final system design, the component subsystems that make up the design, the software used for tying network intelligence and mission tasking into autonomous flight controls, and the results of component and system testing. It covers activities between November 1, 2004 and June 30, 2005.

Technical Approach and Detailed Activities

This activity implemented autonomous flight control, communication, and sensor sub-systems for the UAV and developed software to allow these subsystems to be monitored and controlled in-flight. The next section is a technical description of the UAV system and the system tests. Of particular note is a hardware-in-the-loop test capability that enabled emulated field tests to be performed in the lab for rapid system development.

1 Technical Approach

1 UAV C3 System

The UAV platform for this project is the Ares UAV designed and built at the University of Colorado. For a UAV to have autonomous flight abilities driven by mission tasks and sensor inputs, as well as cooperative multi-UAV capabilities, additional functionality is needed. This functionality comprises Autonomous Flight Control (AFC), Communication Networking, and Sensor modules. The AFC operates the UAV control surfaces, measures position and orientation information, and guides the plane to reach waypoint or flight-pattern goals. The networking module communicates with other UAVs and ground nodes to route traffic to or from the UAV. It also acts as a relay node in an ad hoc (aka mesh) network formed with other network nodes. The sensors might be cameras or temperature, chemical, or other probes. The main tasks were to develop the hardware for each subsystem and to define interfaces between the subsystems.

The UAV system integrates custom and COTS components developed in previous efforts with components previously developed for a number of UAV projects sponsored by the Department of Aerospace Engineering Sciences (AES).

Figure 1 is a schematic of the proposed modular UAV system architecture. The subsystems are connected via a CAN bus interfaced through Naiad nodes. The Naiad is a small interface board with the capability to perform as an intelligent interpreter between the various subsystems. These distributed nodes perform tasks ranging from simple analog to digital conversion to running control algorithms that issue steering commands to the Piccolo autopilot based on communications and sensor inputs. The Thalassa sensor package was developed for the UAV-C3 experiments. Thalassa is a Naiad board with integrated, miniature COTS temperature, humidity, and pressure sensors. These sensors are used to demonstrate the distributed capability indicated in Fig. 1. The following sections are a detailed discussion of the subsystem components.

1 Naiad Interface Nodes and Thalassa Sensor Board

The Naiad[1] was originally developed as the core of several avionics packages for UAV projects sponsored by AES. The current system features distributed computing based upon the Atmel ATmega128 microcontroller (see Appendix A for detailed specifications). These full-featured microcontrollers are used to create the interface boards that are interconnected through the fault-tolerant, high-speed Controller Area Network (CAN) serial bus. Each interface board supports a single subsystem (sensor module, communications, etc.) and serves as an intelligent interpreter between COTS components and the AFC system. The Naiad system allows the interface board to handle all low-level tasks such as polling instrumentation, and allows the bus protocol to remain high-level. In addition, the central flight computer can be responsible for system management through connections to the interface boards via the CAN bus.

The CAN bus supports data exchange rates up to 1 Mbps. Unlike other protocols where each node reads messages to its address only, CAN allows a node to subscribe to and broadcast message “types.” For example, a temperature measurement might be broadcast onto the bus as a type ‘temp’ and a pressure measurement might be a type ‘press’. If the central computer node is subscribed to these message types, it will accept a temp or press message without the need for the sender node to specify the central computer’s address. This enables several nodes to broadcast a certain message and to be switched in and out of the system without having to do any reprogramming of the other interface nodes.
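The subscribe-and-broadcast scheme above can be sketched in C. The struct layout, type values, and subscription-list size below are illustrative assumptions, not the actual Naiad firmware:

```c
#include <stdint.h>

/* Hypothetical message-type IDs -- the real Naiad firmware defines its own. */
enum { MSG_TEMP = 0x10, MSG_PRESS = 0x11 };

typedef struct {
    uint16_t type;      /* message "type", not a destination address */
    uint8_t  len;
    uint8_t  data[8];   /* a CAN 2.0 payload is at most 8 bytes */
} can_msg;

typedef struct {
    uint16_t subs[8];   /* types this node subscribes to */
    int      nsubs;
} can_node;

/* A node accepts a frame only if it subscribes to the frame's type; the
   sender never needs to know the receiver's identity. */
int can_accept(const can_node *n, const can_msg *m)
{
    for (int i = 0; i < n->nsubs; i++)
        if (n->subs[i] == m->type)
            return 1;
    return 0;
}
```

Because acceptance depends only on the type field, a new sensor node can be added to the bus without reprogramming any existing subscriber.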

Thalassa is an expanded Naiad node with miniature temperature, pressure, and humidity sensors integrated onto the circuit board. The Naiad was used as a base system to allow for communications across the CAN bus, sensor data acquisition, and the use of the Naiad firmware. This resulted in a significant savings in both development time and cost. The Thalassa board serves as an interface for all three sensors.

Figure 2(a) is a photograph of two Naiad nodes to show the board population and the connector layout. Naiad and Thalassa nodes are shown with a battery pack in Fig. 2(b). This is the Ares-2 mounting configuration for the flight experiment.

Figure 3 is a block diagram of the Naiad and Thalassa circuit boards. The diagram emphasizes that circuitry is the same for both boards with the addition of the Thalassa sensors enclosed in the green box. Note that the sensors include a single-axis magnetometer that was not used in the present experiment.


2 Supervisory Computer

The Supervisory Computer module acts as a central computer to the distributed system, running high-level mission control algorithms. The purpose of this system is to collect all of the data and telemetry from the other subsystems and use this information, along with mission goals and limits, to monitor system health, turn sensors on and off, and to provide sensor reactive control commands to the autopilot system. These sensor reactive control algorithms are also responsible for tying network intelligence and mission tasking into autonomous flight control commands.

For this project, the required processing to execute the experiment test plan was minimal, so the supervisory computer algorithms were run directly on the Naiad interface node, reducing the AFC to a single component. If more processing is required in the future, the Naiad can be interfaced with other computers as demonstrated in the Communication Module, or be extended with more Naiads running specialized tasks within the Computer Module.

3 PiccoloPlus Autopilot

The PiccoloPlus is the second generation of the Piccolo autopilot system developed by Cloud Cap Technology. It is a complete integrated avionics system including the core autopilot, flight sensors, navigation, wireless point-to-point communication, and payload interface capabilities all in a small and inexpensive package. The Piccolo provides three modes of UAV operation: manual pilot control, GPS waypoint navigation, and steering control. Manual pilot control is used for takeoffs and landings of the Ares aircraft while GPS waypoint navigation is the primary operation mode. In this mode, the autopilot system follows a series of GPS waypoints while holding altitude and airspeed. In steering control mode, the autopilot system maintains altitude and airspeed while following an externally provided turn (heading) rate, enabling the aircraft to be steered.

The photograph in Fig. 4 illustrates the small size of the portion of the Piccolo system that is mounted inside the UAV. The block diagram shows the system components and interface connections. The point-to-point communication link uses the MHX-910/2400 frequency-hopping radio from Microhard Systems Inc., which has a maximum output of 1 W in the 900 MHz ISM band and provides the primary communication link between the ground station and the Piccolo. The built-in inertial measurement unit (IMU) in the lower right is the major addition from the first-generation system to the PiccoloPlus.


The complete PiccoloPlus system includes a ground station connected to a laptop PC and a manual R/C-style pilot console. With the exception of the IMU, the ground station contains exactly the same avionics package mounted in the UAV. The PC provides the platform for the Piccolo graphical operator interface (OI) that allows the user to track the UAV GPS location, to monitor its performance, and to upload GPS waypoints or enter steering commands.

All commands that are exchanged between the ground station and the Piccolo, including steering commands, can also be uploaded through an external RS232 serial interface on the Piccolo. The Naiad in the AFC uses this interface to communicate with the Piccolo to get the health and status information, and to issue steering or GPS waypoint commands to the Piccolo. The Piccolo is capable of duplex operation so that commands from an onboard system can be intermixed with commands received over the 900 MHz link from a remote operator. Thus, control of the Piccolo by a remote operator and onboard systems are not mutually exclusive modes.

4 Communication Interface

The Communication Interface uses the Mesh Network Radio (MNR) developed as part of a prior sponsor contract. The MNR, discussed in detail in Appendix B, consists of an Orinoco Gold IEEE 802.11b PCMCIA card, a Soekris 4511 single board computer, a Fidelity-Comtech 1-W bidirectional amplifier, and a Garmin GPS. The radio runs the Dynamic Source Routing (DSR) ad hoc routing protocol that enables the aircraft to communicate with other aircraft and with similar ground nodes. The DSR protocols developed under this project are available as an open source download at pecolab.colorado.edu.

The Comm Module interface Naiad communicates with the MNR over an RS232 serial port running at 57600 baud. A packet communication scheme, developed in previous work with the VirtualCockpit (discussed below), is used to exchange data objects over the serial link and to ensure their integrity with a 16-bit checksum.

5 Sensors

The Naiad interface system allows for a heterogeneous suite of sensor modules, independent of the sensor interface. This capability enables the notion of “sensor” to include traditional temperature sensors as well as the use of the MNR as a sensor, measuring network performance metrics such as network connectivity, link throughput and link signal strength. For the proposed experiments, the MNR metric of interest is the connectivity with the VirtualCockpit.

To emulate a third-party scientific payload, the Thalassa sensor board discussed earlier was developed to measure temperature, pressure, and humidity. Temperature and humidity are measured on a single chip manufactured by Sensirion AG[2], part number SHT15. The manufacturer reports a temperature resolution of 0.01 °C with accuracy of ±0.3 °C at 25 °C. The relative humidity is resolved at 0.03% with accuracy of ±0.3%. Absolute pressure is measured with the Intersema[3] MS5534B Barometer Module. This module measures pressure in the 10–1100 mbar absolute pressure range, with a reported resolution of 0.1 mbar and an accuracy of ±1.5 mbar at 25 °C.
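As an illustration of how raw SHT15 sensor words become engineering units, the first-order conversions below follow the form given in the Sensirion SHT1x datasheet. The exact coefficients depend on supply voltage and resolution settings, so the values here (14-bit temperature at 5 V, 12-bit humidity) are assumptions for illustration, not the flight calibration:

```c
/* First-order SHT1x conversions, adapted from the Sensirion datasheet.
   Coefficients are assumptions (5 V supply, 14-bit temperature,
   12-bit humidity) and not the calibration used in the experiment. */
double sht1x_temp_c(unsigned so_t)   /* raw temperature word -> deg C */
{
    return -40.1 + 0.01 * (double)so_t;
}

double sht1x_rh_pct(unsigned so_rh)  /* raw humidity word -> %RH */
{
    double so = (double)so_rh;
    return -2.0468 + 0.0367 * so - 1.5955e-6 * so * so;
}
```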

2 Remote Monitoring Station

To enable UAV C3 over the 802.11 wireless network, a remote monitoring station (RMS) was developed. The RMS consists of a Pentium level laptop and an Orinoco Gold IEEE 802.11b PCMCIA card. The laptop runs the Linux operating system and uses the same DSR routing algorithm implemented on the MNRs to enable network communication. The laptop also runs the VirtualCockpit program, which provides the networked C3 UAV interface for a remote operator. In addition, the VirtualCockpit is responsible for logging the communication packets between the RMS and UAV to enable flight replay and post processing of the sensor data.

Communication between the RMS and the MNR on the UAV is enabled with the use of network sockets. Network sockets, as described by the Berkeley socket API, are the most common interface for accessing network services, particularly TCP/IP channels, from the C and C++ languages on most operating systems.

1 Virtual Cockpit

The VirtualCockpit (VC) was originally developed to support a remotely-piloted UAV project sponsored by AES and the National Severe Storms Laboratory[4]. It was designed to provide the pilot with a Virtual Field Environment of the aircraft using a 3D view of the aircraft, a heads-up-display with heading and altitude, digital elevation maps, and overhead projections for the location of the aircraft. A screen shot of the original VirtualCockpit during a replay of flight data from an Aerosonde UAV is shown in Fig. 5.

The VC is implemented in C++ using the OpenGL and GTK+ libraries. OpenGL and GTK+ are multi-platform toolkits for creating graphical user interfaces. Though the VC was primarily designed to run in a UNIX based operating system, use of these libraries enables the VC to run in Windows as well while providing the same look and functionality to the operator.

For this work, the VC was reconfigured to provide the networked command, control and communications to the UAV and provide the operator with situational awareness. The functional interface was chosen to follow, in general, the same scheme as provided by the Piccolo operator interface (OI). In most cases, the VC reproduces the same functional interface to the UAV as the OI. The main exception is manual piloting, which must go through the Piccolo ground station. Thus, the goal of the RMS is not to replace the Piccolo OI, but to reproduce and extend the capabilities over a network interface.

This work is the first step in the development of a user interface for a UAV swarm by providing command and control, from a single operator, of multiple UAVs over a wireless meshed network interface. Specifically, the VC utilizes network socket communication over an 802.11 meshed network to interface with a UAV for command and control while providing a custom graphical display of UAV, network, and sensor data of interest. Screen shots are shown in Fig. 6.


On the left of Fig. 6 is the main VC window, which provides the high-level situational awareness to the operator through an aerial perspective displaying ground station location (blue dot), GPS waypoint plans for a UAV (green line and black dots), and UAV location (red triangle) on a georeferenced aerial photograph. In addition to displaying GPS latitude and longitude of network nodes, the symbols are augmented with additional information to the upper left of the symbol: the node ID, altitude, and the next UAV waypoint. Note that the waypoint plans shown in Fig. 6 are those that were used during the experimental testing. The top waypoint plan (or flight plan) contains waypoints 0 through 7 to form a clockwise orbit about the runway location. The southern flight plan uses waypoints 10 through 15 to form a counterclockwise orbit to the south of the runway. For future reference, the northern flight plan is referred to as flight plan 1 for experimental testing and the southern is flight plan 2.

The green box on the bottom left in the main window indicates a good connection to a UAV node with an IP address of 127.0.0.1. The two windows on the right are two panes provided to the operator for each UAV that is connected on the network. The top pane provides Piccolo health and status, as would be seen on the Piccolo OI, in addition to displaying statistics from ping commands between the ground station and the UAV. The bottom pane provides scrolling strip charts for the display of the Thalassa sensor data (temperature, humidity, and pressure) as well as displaying communication performance statistics measured on board the UAV for the network socket interface to the ground station and the serial interface to the Naiad communications node.

Though only a single UAV is shown connected in the simulation screen shots, the VC has been developed to connect to MNR nodes within the network and show their locations as well. When an MNR is connected, an orange square is drawn on the display.

2 Network Socket Communications

Communication over the 802.11 network between the UAV and the RMS is enabled through the use of network sockets. A socket is a software endpoint that establishes bidirectional communication between any two programs running within the network. The socket associates a server-side program with a specific hardware port on the machine where it runs so any client-side program within the network, with a socket associated with that same port, can communicate with the server program.

Sockets that communicate via a reliable channel, i.e. a TCP link, have a dedicated point-to-point channel between themselves. To communicate, they establish a connection, transmit the data, and then close the connection. All data sent over the channel is received in the same order in which it was sent; this is guaranteed by the TCP channel. In contrast, applications that communicate via datagrams, such as UDP, send and receive completely independent packets of information. These clients and servers do not have and do not need a dedicated point-to-point channel. Neither the delivery of datagrams to their destinations nor the order of their arrival is guaranteed. In real-time applications where data is continually downlinked at a known rate, it is believed to be better for the network to route the most recent data packets than to be bogged down getting every packet through the network. Thus, the UDP protocol was chosen for the experimental results presented below.
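A minimal sketch of the connectionless UDP exchange described above, using the Berkeley socket API on the loopback address: one socket stands in for the RMS and another for the UAV's MNR. The port number and payload format are invented for this example:

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* One socket ("RMS") binds and listens; the other ("UAV") sends a single
   datagram.  No connection is established -- each packet is independent.
   Port 9750 and the payload string are arbitrary for this sketch. */
int udp_loopback_demo(char *buf, size_t buflen)
{
    int rx = socket(AF_INET, SOCK_DGRAM, 0);
    int tx = socket(AF_INET, SOCK_DGRAM, 0);

    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(9750);
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);

    bind(rx, (struct sockaddr *)&addr, sizeof addr);   /* RMS listens */

    const char *report = "temp=25.0,press=843.2,rh=31.5";
    sendto(tx, report, strlen(report), 0,              /* UAV downlinks */
           (struct sockaddr *)&addr, sizeof addr);

    ssize_t n = recvfrom(rx, buf, buflen - 1, 0, NULL, NULL);
    if (n < 0) n = 0;
    buf[n] = '\0';

    close(rx);
    close(tx);
    return (int)n;
}
```

If the datagram were dropped in flight, the receiver would simply never see it; nothing retransmits, which is exactly the behavior preferred here for fresh sensor data.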

A simple command byte structure was implemented to exchange data between the VirtualCockpit and the UAV over the socket API. Because of its simplicity and utility, the same packet protocol scheme is used over the RS232 serial connection between the MNR and the Naiad node in the Comm Module. The protocol structure is shown in Fig. 7. The ID field identifies which command and/or data set is contained within the message. Since it is known whether a system is receiving or transmitting via a socket interface, command IDs represent the general command and not directionality. The NUM field is the number of bytes contained within the actual parameters, which are held in the BYTES[] field, an array of unsigned bytes that carries the data of the command or data structure. The CHECKSUM field uses the TCP checksum calculation, a 16-bit ones-complement sum of the data. The checksum helps maintain byte alignment in incoming communication streams and ensures data correctness.
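The ID/NUM/BYTES[]/CHECKSUM framing can be sketched as follows. The field widths and byte order are assumptions, since Fig. 7 is not reproduced here, but the checksum is the standard 16-bit ones-complement sum used by TCP/IP (RFC 1071):

```c
#include <stddef.h>
#include <stdint.h>

/* 16-bit ones-complement sum (RFC 1071 style): sum 16-bit words,
   fold carries back in, then take the ones complement. */
uint16_t frame_checksum(const uint8_t *p, size_t n)
{
    uint32_t sum = 0;
    while (n > 1) { sum += (uint32_t)p[0] << 8 | p[1]; p += 2; n -= 2; }
    if (n) sum += (uint32_t)p[0] << 8;                     /* pad odd byte */
    while (sum >> 16) sum = (sum & 0xffff) + (sum >> 16);  /* fold carries */
    return (uint16_t)~sum;
}

/* Pack ID | NUM | BYTES[] | CHECKSUM into out; returns the frame length.
   out must hold at least num + 4 bytes. */
size_t frame_pack(uint8_t id, const uint8_t *data, uint8_t num, uint8_t *out)
{
    out[0] = id;
    out[1] = num;
    for (uint8_t i = 0; i < num; i++) out[2 + i] = data[i];
    uint16_t ck = frame_checksum(out, 2u + num);
    out[2 + num] = (uint8_t)(ck >> 8);
    out[3 + num] = (uint8_t)(ck & 0xff);
    return 4u + num;
}

/* A receiver recomputes the checksum over ID..BYTES and compares it with
   the trailing field; a mismatch marks a corrupt or misaligned frame. */
int frame_ok(const uint8_t *f, size_t len)
{
    uint16_t ck = frame_checksum(f, len - 2);
    return f[len - 2] == (uint8_t)(ck >> 8) &&
           f[len - 1] == (uint8_t)(ck & 0xff);
}
```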

2 The Ares UAV

Figure 8 is a three-view and isometric view of the basic Ares UAV. The airframe is based on the layout of the Senior Telemaster, a popular RC model. Custom modifications include an expanded fuselage to accommodate a payload section and conversion from a tail-dragger configuration to one with tricycle gear. The monocoque fuselage, wings, main gear, and horizontal tail are made of a carbon composite laid over carbon-composite laminated plywood bulkheads and ribs. Because it covers a 900-MHz antenna, the vertical tail is constructed of fiberglass. The Ares is powered by a 5-hp, two-stroke engine.

Including the prototype Ares-1, the Ares fleet includes four vehicles numbered 1 through 4. Ares-2 was used for the experiments described in this report. Table 1 summarizes the performance of the Ares UAV and Fig. 9 shows, from left to right, Ares-1, Ares-2, and Ares-3 with pilots prepared to fly before a series of experiments in fall 2004.



3 System Testing

The goal of the system testing was to evaluate subsystem integration of the more than one-dozen required components distributed between the UAV, ground nodes, ground stations, and network. The testing consisted of laboratory hardware-in-the-loop (HIL) simulation and initial flights for system shakeout. Following completion of system testing, specific demonstration experiments were performed as described in the Test Plan section.

The laboratory HIL simulation architecture is shown in Fig. 10(a). There are two modes for the field experiment. Figure 10(b) is the mode during takeoff, where manual commands are uplinked through the 900-MHz ground station link directly to the Piccolo. When control is switched to the Piccolo autopilot for the autonomous portion of the experiment, the ground station with its 900-MHz link moves to a purely backup role and is not actively involved in the experiment, as shown in Fig. 10(c). Once the autonomous experiment is completed, the system is returned to the mode shown in Fig. 10(b) for a manually controlled landing. A key point from Figure 10 is that the laboratory setup for HIL simulation uses all of the physical components that are deployed in the field tests.

The three common systems in the architectures presented in Fig. 10 consist of a Remote Monitor Station (RMS), Ares UAV, and a Piccolo Ground Station (GS). The Piccolo GS provides the primary pilot control and interface to the Piccolo autopilot while the RMS provides the network interface for networked C3. The following sections present the details of the testing environments and the final field test plan.

1 Hardware-in-the-Loop Testing

Development and testing were facilitated by hardware-in-the-loop testing in a laboratory environment as shown in Fig. 11. The lab test bed uses all of the hardware and software components in the full system, except for the actual sensor measurements of the Piccolo unit. Instead, modified GPS, inertial, and air-data measurements are given as inputs to the navigation subsystem from flight simulator software provided by Cloud Cap Technology. The navigation subsystem outputs the control surface commands to the flight computer.

The result is that the UAV “thinks” it is flying while sitting on the lab bench, deflecting the aerodynamic control surfaces and actuating the throttle. In this way, system interactions can be tested under simulated flight conditions, and the behavior of the UAV as it switches between different regimes can be tested and debugged prior to an actual flight test. This enabled many software bugs and interoperability problems to be efficiently captured in a rapid-prototyping environment.

The laboratory environment is never completely realistic. During flights, links can be weak or vary while the UAV maneuvers. These conditions are partially simulated by moving nodes around the lab building or shielding the radio antennas. During flights, the UAV airframe and payloads are subject to engine vibration and other dynamic mechanical forces that are not simulated in the lab. The flight simulator has a vehicle model based on the Ares UAV, but the model’s flight characteristics can differ from what is observed during actual flights. The sensors measure conditions in the lab that would not be observed in the air. Despite these deficiencies, the lab setup proved invaluable for completing the development and testing on time.

2 Test Plan

The tests were performed at the Table Mountain National Radio Quiet Zone north of Boulder, CO. The layout of the airfield, flight plans, network operation center, mesh network radio nodes, and virtual cockpit are shown in Fig. 12. The test plan consisted of two experiments to test the ability of the aircraft to react to changes in sensor measurements or communication status. The temperature sensor was monitored for simulated icing conditions in the first experiment while connectivity between the UAV and a specified ground node was monitored in the second experiment. Icing conditions or sustained loss of connectivity would trigger a change in flight plan.


A basic experiment consists of the following steps.

1. The Ares UAV is manually piloted for takeoff and transitioned into flight plan 1.

2. Ares is commanded into autonomous mode and flies flight plan 1, which is preloaded.

3. A new flight plan is entered over the communication link followed by a “start experiment” command.

4. Ares transitions into flight plan 2 where it sends a sensor report every second consisting of temperature, pressure, and humidity data.

5. Ares maintains flight plan 2 until one of the following conditions is met.

a. The temperature probe records a temperature below 40 degrees Fahrenheit (potential icing).

b. The communication link between the RMS and UAV has been down for more than 40 sec.

6. Ares transitions back into flight plan 1.

7. If another experiment is desired in the same flight, step 3 is repeated, otherwise Ares is manually landed.

The two experiments based on this basic design were a virtual icing experiment and a communication reactive flight experiment, each designed to trigger one of the conditions in step 5. For the virtual icing experiment, a temperature offset was subtracted from the temperature sensor reading. The offset starts at zero and decreases over time to simulate a dropping temperature. In the communication reactive flight experiment, the constant pinging between the UAV and the ground station, which is used to measure network performance, is manually stopped to simulate a communication loss. Because only the pings, which act as a “heartbeat” to both systems for link connectivity, are stopped, sensor and UAV data can still be downlinked to the RMS.
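The step-5 trigger logic reduces to a simple decision function. The thresholds (40 °F and 40 s) come from the test plan above; the function and type names are illustrative, and the temperature argument is assumed to already include the artificial offset:

```c
/* Sketch of the step-5 triggers from the test plan: return to flight
   plan 1 when the (offset-adjusted) temperature falls below 40 degF or
   the RMS link "heartbeat" has been silent for more than 40 s.
   Names are hypothetical; thresholds are from the test plan. */
enum trigger { TRIG_NONE, TRIG_ICING, TRIG_COMM_LOSS };

enum trigger check_triggers(double reported_temp_f,
                            double secs_since_last_ping)
{
    if (reported_temp_f < 40.0)        /* virtual icing condition */
        return TRIG_ICING;
    if (secs_since_last_ping > 40.0)   /* sustained loss of connectivity */
        return TRIG_COMM_LOSS;
    return TRIG_NONE;
}
```

In the flight system this check would run periodically on the supervisory Naiad, which on a trigger issues the flight plan 1 waypoint command to the Piccolo.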

The significance of the steps in these experiments is represented in the following diagram. The arrows indicate which subsystem is influencing which other subsystem in each step. In the diagram, we see that every influence relationship (except from the AFC to the sensor) is tested.

Step 1,2: Ares is launched and flies autonomously in preloaded flight plan 1.

Step 3: A new flight plan is entered over the communication link along with a “start experiment” command, which turns on the sensors.

Step 4: Ares transitions into flight plan 2 where it sends a sensor report every second.

Step 5a: Ares maintains flight plan 2 until the temperature probe records a temperature below 40 degrees, at which time the plane returns to flight plan 1 and sends a freeze warning.

Step 5b: Ares maintains flight plan 2 until the communication to the RMS is down for more than 40 sec, at which time the plane returns to flight plan 1 and resets the sensor.

Test Results

Experiment results were obtained in a 50 min flight of Ares-2, with the aircraft under full autonomous control for more than 30 min. The first 15 minutes of flight were used to go through a series of test cards for system checkout and verification of the Piccolo autopilot system. After successfully passing all of the test cards, interaction of the VirtualCockpit and Ares-2 was tested by manually commanding waypoint target changes to cause the UAV to switch from flight plan 1 to plan 2. Finally, two separate experiments were run to test the two primary triggers laid out in the test plan: temperature and communication link.

During the experiment, the MNR on the UAV continually downlinked packets to the RMS, including the Thalassa sensor measurements, GPS position from the Piccolo, and the waypoint commands generated by the AFC Naiad. Thalassa sensor measurements were downlinked over the 802.11 DSR network to the RMS at 2 Hz and logged by the VirtualCockpit, while the Piccolo-specific data packets were downlinked at 1 Hz.

1 Experiment Results

The primary results of the flight experiment are shown in Fig. 13. The top plot shows the waypoint number the UAV was tracking at each moment. The middle plot shows the temperature measured on board the UAV (including the offset), and the bottom plot shows the ping times recorded by the ground station. The horizontal axis is referenced to mission time, which started before takeoff when the UAV was powered on and first connected to the RMS. Together, these three plots give the results of executing the experimental test plan described above.

In the top plot, it can be seen that at each start experiment command (two are shown in the time scale covered), the UAV transitions from flight plan 1 (waypoints 2–7) to flight plan 2 (waypoints 10–15). Upon detecting a sensor trigger, as defined in the test plan, the UAV transitions from flight plan 2 back to flight plan 1. Refer to Fig. 6 for the exact flight plan layout and waypoint numbering used in the experiment.

The first trigger in the experiment flight corresponded to the temperature probe reading falling to 40 deg F. This is seen in the temperature plot (middle), with the first start experiment command sent at roughly 58 minutes. After the start command is given, the Thalassa node introduces an artificial offset to represent a dropping temperature. Just before the 60th minute the temperature reaches 40 deg F, causing the sensor trigger; in the top plot the waypoint being tracked jumps from waypoint 14 to waypoint 2.

The second sensor experiment was started after the 64th minute and demonstrates the ability of the UAV to respond to a change in network performance. For this experiment, connectivity between the UAV and the RMS was chosen as the primary trigger and was measured by sending ping commands between the two. The loss of pings is seen in the bottom plot, with the trigger occurring after 40 seconds without pings.

[pic]

2 UAV-RMS Network Communication

For the experiments, UDP socket connections were used to link the RMS to the UAV. Figure 14 shows the round-trip ping times from the RMS to the UAV and back during system checkout on the ground and throughout the flight. The left plot shows the ping times over time and the right is a histogram. The ping times are typically 120 msec, which includes the processing time on each end and transmission through one or more intermediate mesh network links.

[pic]
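The round-trip measurement can be reproduced with plain UDP sockets. The sketch below is a loopback stand-in for the RMS-to-UAV path: one side echoes a datagram, the other times the reply. Addresses and payload are illustrative, not the actual experiment configuration.

```python
import socket
import threading
import time

def echo_server(sock):
    """Echo one datagram back to its sender (the UAV side of the ping)."""
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

def roundtrip_ms(server_addr):
    """Send one datagram and time the echoed reply (the RMS side)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(2.0)
        t0 = time.monotonic()
        s.sendto(b"ping", server_addr)
        s.recvfrom(1024)
        return (time.monotonic() - t0) * 1000.0

# Loopback demonstration: start an echo "UAV" and ping it once.
srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
srv.bind(("127.0.0.1", 0))
threading.Thread(target=echo_server, args=(srv,), daemon=True).start()
rtt = roundtrip_ms(srv.getsockname())
srv.close()
```

Over loopback the round trip is well under a millisecond; the 120 msec seen in flight reflects radio transmission and per-node processing rather than socket overhead.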

From previous flight experience, it was expected that ping times would depend on the attitude of the aircraft relative to the ground station. However, because the aircraft operated close to the ground station (typically within 1 km), Fig. 15 shows no noticeable dependence of ping time on location (which relates to turning attitude within the flight plan). The dense cluster of ping times at the center of the plots represents data collected while the UAV was on the ground during preflight checkout. As the operational range increases, however, the ping times are expected to be affected by the attitude of the aircraft. In addition, when a multi-hop link is used, ping time will again depend on the location of the aircraft, since location is the primary factor in determining network routes.

3 Sensor Measurements

Figure 16 shows plots of the relative humidity, pressure, and temperature over time that were logged by the VirtualCockpit for the entire experiment. Between the 0th and 36th minute of mission time, the change in the recorded sensor measurements is primarily due to the warming atmosphere (between the 11:00 am and 12:00 pm hours) and to the UAV moving into and out of shade during ground setup and checkout.

The dramatic change that occurs at the 36th minute represents the takeoff phase of the experiment and shows the UAV cooling down to atmospheric temperature after sitting in the sun. The drop in atmospheric pressure, shown in the middle plot, is due to the location of the pressure sensor within the UAV and is related to the airspeed of the UAV.

Figure 17 displays an interesting (and unexpected) result: the humidity sensor was able to detect a 2% change in relative humidity along the UAV flight plan encircling the runway. The positional dependency of humidity over a four-minute period, starting at 62 minutes mission time, is shown. The left plot shows the GPS location of the UAV colored by the measured relative humidity over an aerial image of Table Mountain. Black represents a relative humidity below 14% while red represents 16.5% and above; the colors change in 0.5% increments between these two bounds. The humidity change over the orbit is caused by the local ground terrain beneath the UAV, which flies only 300 ft above the ground. The blue and green colors are above the mesa while the red and pink colors are off the mesa. Because of the age of the aerial image, the foliage present on the edge of the mesa, which gives the higher humidity values, is not shown. In addition, a stream that is not readily visible runs adjacent to Nelson Road (the main diagonal road in the image) and contributed to the higher humidity.
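The color binning described above (black below 14%, red at 16.5% and above, 0.5% steps between the bounds) can be written as a small mapping function. The bin indices standing in for actual plot colors are an assumption for illustration.

```python
# Sketch of the humidity color binning: bin 0 is "black" (below 14%), the
# top bin is "red" (16.5% and above), with 0.5% increments in between.

LOW, HIGH, STEP = 14.0, 16.5, 0.5

def humidity_bin(rh_percent):
    """Return a bin index (0..6) for a relative-humidity reading."""
    if rh_percent < LOW:
        return 0                                  # black
    if rh_percent >= HIGH:
        return int((HIGH - LOW) / STEP) + 1       # red
    return int((rh_percent - LOW) / STEP) + 1     # intermediate color
```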

A contour plot was generated from these data using a 2D interpolation technique known as Kriging and is shown in Fig. 18. The contour plot highlights the type of information in the data collected by the UAV that would interest an atmospheric scientist.

[pic]
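The report gridded the scattered samples with Kriging. As a self-contained stand-in, the sketch below uses a simpler inverse-distance-weighted (IDW) interpolation to turn scattered (x, y, humidity) samples into the regular grid a contour plot needs; the sample values and grid are made up for illustration.

```python
# IDW interpolation of scattered samples onto a regular grid, standing in
# for the Kriging used in the report.

def idw(samples, x, y, power=2.0):
    """Interpolate scattered samples [(x, y, value), ...] at point (x, y)."""
    num = den = 0.0
    for sx, sy, sv in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return sv                       # exactly on a sample point
        w = 1.0 / d2 ** (power / 2.0)       # closer samples weigh more
        num += w * sv
        den += w
    return num / den

def grid(samples, nx, ny):
    """Evaluate the interpolant on an nx-by-ny unit grid (contour input)."""
    return [[idw(samples, i / (nx - 1), j / (ny - 1))
             for i in range(nx)] for j in range(ny)]
```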

Conclusion

Stage 1 of the Networked UAV C3 project has been completed successfully. Having built upon earlier work on COTS wireless-based communication networks made up of airborne and terrestrial nodes and components, this stage is the first step in a program to demonstrate groups of UAVs that cooperatively and autonomously complete high-level mission tasks.

Stage 1 focused on a single UAV and developed the UAV hardware and testing infrastructure for autonomous cooperative groups of UAVs. It integrated UAV autonomous flight control (AFC), network communication, and sensor subsystems with an onboard communication bus and a supervisory computer. This enabled software algorithms to intelligently control the operation of the plane based on flight, sensor, communication, and mission status. The design is based on a distributed communication architecture that enables subsystems to be added or removed with minimal modification to other subsystems. For instance, additional temperature sensors could be added to the UAV with no new addressing or software needed.

In parallel, UAV remote monitoring and control software was incorporated into the design. This enabled an operator to monitor the vehicle subsystems remotely in real time via a graphical user interface. The GUI enabled tasks and vehicle commands to be uploaded using simple, intuitive checkbox, menu, and parameter-field manipulation. The remote monitoring is a networking application built directly on top of the ad hoc network constructed under the previously completed Wireless Communication Test Bed program.

A hardware-in-the-loop laboratory test bed was constructed so hardware, software, and interoperability elements could be rapidly developed and tested. This capability was invaluable throughout the project and will be a key resource in future program stages.

The final system was flown at the Wireless Communication Test Bed to demonstrate the remote monitoring and control software and the ability of the plane to make autonomous decisions based on interactions between the different subsystems. The experimental results showed that the UAV could be remotely commanded with mission tasks that were interpreted on board into specific actions. The plane flew flight patterns that depended on sensor measurements and communication status.

Post processing of the sensor data revealed that a small near-ground UAV can collect unique and interesting data. The UAV was able to detect the presence of a small stream not readily apparent from visual sources such as satellite imagery. Such capability is directly applicable to wildland fire fighting and suggests other applications, such as chemical sensors on UAVs to provide precise locating and tracking of chemical plumes.

Future stages of the Networked UAV C3 program will build on the Stage 1 capabilities to develop single-UAV mission intelligence that robustly solves useful tasks based on high-level operator commands. This will be extended to multiple UAVs, and mechanisms for multiple UAVs to share flight, sensor, communication, and tasking data will be developed.

A. Naiad Interface Node Specifications

The heart of the Naiad node is an 8-bit Atmel microcontroller with the following features:

• 8 10-bit A/D channels

• 2 TTL-level UARTs

• I2C interface

• SPI interface

• External interrupts

• Several GPIO lines (depending on other feature usage)

• ISP serial programming interface

• PWM channels

• 2 8-bit timer/counters

• 2 16-bit timer/counters

The Naiad node includes the following other features:

• TTL-to-RS-232 signal conversion

• CAN bus interface with fault-tolerant transceiver

• Watchdog timer and brown-out detection

• 128 Kbit of EEPROM storage

In addition to the hardware features, the chip is fully supported by a GCC tool chain. Software drivers have been written for each of the devices, and libraries are available for:

• Filtering data

• Buffering data

• Interfaces to many third-party sensors:

- GPS Units

- IMU

- Pressure

- Temperature

- Humidity

- Magnetometer

• High level CAN protocol

• Timing device operations

CAN Bus

The CAN bus may be operated at many speeds and was run at 125 kbps for the experiment. Each CAN frame can transport between 1 and 8 bytes of user data. Maximum bandwidth was tested, and the results are shown on the left in Fig. A.1. The maximum bandwidth measured in the lab is 7.5 kB/s (60 kbps). The communications used in the actual experiment were logged, and the bandwidth usage for the entire experiment is shown in the histogram on the right in Fig. A.1. As can be seen, only a tiny fraction (less than 1%) of the potential bandwidth was needed for the operation of the UAV and the experiments.

[pic]
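The measured 7.5 kB/s can be sanity-checked against the frame format: a standard CAN 2.0A data frame adds roughly 44 framing bits plus a 3-bit interframe space around each payload, ignoring bit stuffing. The function below is a back-of-the-envelope estimate, not a measurement; the gap between its upper bound and the measured value is plausibly due to bit stuffing and node processing.

```python
# Approximate CAN payload throughput for back-to-back standard data frames.
# 44 framing bits (SOF, ID, control, CRC, ACK, EOF) + 3-bit interframe
# space; bit stuffing is ignored, so this is an upper bound.

def can_payload_rate(bus_bps, data_bytes):
    """Payload throughput in bytes/s at the given bus rate."""
    frame_bits = 44 + 8 * data_bytes + 3
    frames_per_s = bus_bps / frame_bits
    return frames_per_s * data_bytes

rate = can_payload_rate(125_000, 8)   # 8-byte frames at 125 kbps, ~9 kB/s
```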

B. Communication Infrastructure

The Mesh Network Radios (MNRs)

The wireless AUGNet end-user nodes are referred to as MNRs. These MNRs pose special challenges. The ad hoc nodes must support a variety of configurations: placed inside small UAVs, driven around in vehicles, incorporated into handheld communication devices, or simply placed at various fixed sites. Because the majority of end users are mobile, the MNR devices must be rugged, small, and power efficient. At the same time, the MNR subsystems must use IEEE 802.11 radio components, COTS computers and electronics, and Linux-based open-source software.

The universal MNR consists of ad hoc network software, communication hardware, and monitoring software. The MNR hardware includes a Soekris Model 4511 single board computer, an Orinoco 802.11b card, a Fidelity-Comtech bidirectional amplifier with up to 1W output and a GPS receiver. An MNR is shown in Fig. B.1 without and with an enclosure. The integrated antenna enclosure is 21 cm (8.27 inches) across. The enclosed and environmentally protected MNRs are used in fixed site deployments and mounted on vehicles that drive through and around the test range. The non-enclosed MNRs are placed inside the UAVs.

[pic]

Figure B.1: A Mesh Network Radio (MNR) system without and with an enclosure.

The Soekris single-board computer operates at 100 MHz. It has 128 MB of RAM and 32 MB of compact flash storage that can be expanded. The Soekris computer runs a small variant of the Linux operating system called Pebble, which gives the programmer great flexibility, full control of the computing environment, and compliance with open-source software development standards. The RF amplifier gain is adjustable from 100 mW to 1 W. The GPS receiver sampling rate is one sample per second. The MNR requires an external power supply. As described above, the environmental motherboard enclosure and the integrated antenna are not used for UAVs.

The Ad Hoc Networking Software

Each MNR runs the DSR protocol [5] and communicates with other nodes via the 802.11b card. The DSR version used was designed and implemented in the CU Pervasive Communications Laboratory by UCB faculty and graduate research assistants. It was specifically intended to be used by other research groups and is licensed under the open-source GNU licensing terms [6]. The implementation uses the Click modular router software [7], which provides a great deal of flexibility and allows the experimentation team to modify the protocols to monitor the network and equipment behavior.

The Click Program

The Click modular router software was selected as the infrastructure for implementing the routing protocol. The router software can be configured to run at the user level using a driver program or in the Linux kernel as a kernel module. When Click runs at the user level, it requires a kernel tap that captures packets destined to or originating from the kernel; this allows the packets to be manipulated in user space and then re-inserted into the kernel stack. When Click runs as a kernel module, it can steal packets from the network devices before Linux gets an opportunity to handle them; it sends packets directly to the devices as well as passing them to Linux for normal processing. The kernel module version of Click is used for the MNRs. This gives higher performance because the router runs as part of the operating system kernel.

Click programming consists of defining an interconnected collection of reusable modules called elements. Each element controls a specific aspect of the router: communicating with the network devices, modifying packets, or scheduling packets. The elements are interconnected in a router configuration file, which the Click program reads to set up the router accordingly.

A flow diagram of the communication software is shown in Fig. B.2. The software system is automated and self-starting. It is robust against sudden loss of power: because it uses a RAM file system with a ROM personality, there are no long-term states that can be corrupted.

Dynamic Source Routing (DSR) Implementation

The DSR protocol is defined in an Internet Engineering Task Force (IETF) draft [8], which comprehensively states the exact implementation details of the protocol. We conformed to the draft while designing the Click DSR router elements.

[pic]

Figure B.2: Communication software of the Mesh Network Radio System.

The DSR runs under the control of the MNR, which communicates with other MNRs through the embedded Orinoco 802.11b card. DSR was selected because it provides on-demand routing: a traffic source seeks a route to a destination only when it has data to transmit, so no bandwidth is wasted establishing routes that will never be used. When a packet is ready to send, a route request process establishes a route through the network. DSR also implements source routing, which forces a packet to take a specific path through the network.
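The on-demand discovery can be illustrated with a toy model. Real DSR floods ROUTE REQUEST packets and reads the route back from the accumulated packet header, but over a static map of known links a breadth-first search yields the same source route; the node names below are illustrative.

```python
from collections import deque

# Toy stand-in for DSR route discovery: breadth-first search over a static
# map of links returns the complete source route carried in packet headers.

def discover_route(links, src, dst):
    """Return the full node list src..dst, or None if dst is unreachable."""
    queue = deque([[src]])
    visited = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path                  # complete source route
        for nxt in links.get(path[-1], ()):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

# Toy topology: ground station, two relay MNRs, and the UAV.
links = {
    "RMS":   ["MNR83"],
    "MNR83": ["RMS", "MNR84"],
    "MNR84": ["MNR83", "UAV"],
    "UAV":   ["MNR84"],
}
route = discover_route(links, "RMS", "UAV")
```

Every packet then carries this explicit path, which is what lets DSR force traffic along a specific multi-hop route through the mesh.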

-----------------------

[1] Naiad and Thalassa are Neptune’s first and second moons, respectively.

[2] Sensirion AG, Eggbühlstr. 14, P.O. Box CH-8052, Zürich, Switzerland.

[3] Intersema Sensoric SA, Ch. Chapons-des-Prés 11, CH-2022 Bevaix, Switzerland.

[4] C. Eheim, C. Dixon, B. Argrow, and S. Palo, "TornadoChaser: A Remotely-Piloted UAV for In Situ Meteorological Measurements," presented at AIAA's 1st Technical Conference and Workshop on Unmanned Aerospace Vehicles, Systems, Technologies, and Operations, Portsmouth, VA, 2002.

[5] Johnson, D., Maltz, D., “Dynamic Source Routing in Ad Hoc Wireless Networks,” Mobile Computing, Chapter 5, pp. 153–181, Kluwer Academic Publishers, 1996.

[6] A copy of the source code can be downloaded at

[7] E. Kohler, R. Morris, B. Chen, J. Jannotti, and M. F. Kaashoek, “The click modular router,” ACM Transactions on Computer Systems, vol. 18, no. 3, pp. 263–297, August 2000.

[8] Available from the IETF at .

Figure Captions

Figure 1: UAV C3 architecture.

Figure 2: (a) Top and bottom view of two Naiad boards. (b) Naiad and Thalassa boards mounted with battery power supply prior to installation.

Figure 3: Naiad and Thalassa block diagram.

Figure 4: PiccoloPlus autopilot system.

Figure 5: Original VirtualCockpit screen shot showing a replay of the Aerosonde UAV flight data.

Figure 6: VirtualCockpit screen shots.

Figure 7: Socket command byte structure.

Figure 8: Three-view and isometric view of the Ares UAV; dimensions are in inches.

Figure 9: Ares UAVs with pilots.

Figure 10: System architectures for (a) hardware-in-the-loop simulation, (b) takeoff during field test with manual control over a 900-MHz link, and (c) autonomous flight during field test.

Figure 11: Hardware-in-the-loop testing in the lab.

Figure 12: Experimental setup at Table Mountain.

Figure 13: Experiment results.

Figure 14: Round-trip ping times.

Figure 15: Round-trip ping times plotted against GPS locations.

Figure 16: Thalassa sensor measurements onboard the Ares UAV (takeoff is at minute 36).

Figure 17: Relative humidity measurements collected over a 3 min period.

Figure 18: Contour plot of relative humidity overlaid on an aerial image of Table Mountain.

Table 1: Ares specifications and performance.

Figure A.1: Naiad CAN communication bandwidth.
