University of California, San Diego
Technical Paper
Faculty Advisor: Dr. John Kosmatka
Graduate Advisor: Chad Foerster
Project Manager: Joshua Egbert
Computer Engineering Lead: Shane Grant
Safety Pilot: Thomas Hong
Team:
Matt Wu, Andrew Swingle, Dmitriy Obukhov, Tony Myers,
Peter Xu, Tim Wheeler, Daniel Bedenko and Lewis Anderson
May 2010
2009-2010 project manager: Joshua Egbert (jegbert@ucsd.edu)
2010-2011 project manager: Andrew Swingle (aswingle@ucsd.edu)
Abstract
This paper provides an overview of the design of UCSD's UAS, built to meet the requirements
of the 2010 AUVSI Student UAS Competition. The team's custom-built "Falco" airframe is
controlled by a Procerus Kestrel autopilot. While in the air, the plane flies through a designated
set of waypoints and then searches a field for targets using an HD Sony FBC-H11 block video
camera. The camera is mounted on a pan-tilt gimbal system that points it at commanded
GPS locations. The video is streamed live to the ground digitally over a 2.4GHz link, where it
is processed and displayed to a user. Image processing takes place in two steps: first, regions
of interest in a frame are located, and then target parameters are extracted from these regions.
Telemetry data from the autopilot is synchronized with the video to provide accurate target
positions. Safety is a priority: the aircraft can be controlled autonomously via the ground station,
manually through a 900MHz pass-through, or through a 72MHz R/C controller hard cutover. This
paper describes the design and development of the UAS in three parts. First, the systems design
philosophy and its rationale are presented; then the design of the individual components is described;
finally, the paper concludes with a description of the testing performed to show that the mission
can be completed successfully.
Contents
1 System Overview
  1.1 Team
  1.2 Mission Requirements and Goals
  1.3 Brief System Overview
  1.4 Competition Preview
2 Flight Systems Design and Overview
  2.1 Airframe
    2.1.1 Introduction
    2.1.2 Design Methodology
    2.1.3 Competitive Assessment
    2.1.4 Design
  2.2 Autopilot
    2.2.1 Procerus Kestrel Autopilot
    2.2.2 Virtual Cockpit Ground Station
    2.2.3 Software in Loop Simulation
    2.2.4 Autopilot Safety Considerations
3 Payloads Design and Overview
  3.1 Overview
  3.2 Cameras
  3.3 On Board Processing
  3.4 Gimbal
  3.5 Payload Communications
    3.5.1 Wireless Links
    3.5.2 Serial Data Links
4 Ground Systems Design and Overview
  4.1 Graphical User Interface Software
  4.2 Image Processing
    4.2.1 Image Rectification
    4.2.2 Autonomous Target Location and Identification
5 Testing, Performance and Safety
  5.1 Individual Systems Tests
    5.1.1 Autopilot Testing
    5.1.2 Imagery Testing
    5.1.3 Airframe Testing
    5.1.4 Communications Tests
  5.2 Full Systems Tests
6 Acknowledgments
  6.1 Sponsors
  6.2 Faculty and Staff
1 System Overview

1.1 Team
The University of California, San Diego's AUVSI Student UAS Competition team is composed
of an interdisciplinary group of students. The team is led by Joshua Egbert, an aerospace engineer
who has been an active member of the team for four years. Shane Grant leads the computer science
portion of the team which includes Lewis Anderson, Tony Myers, Doug Libby, and Irving Ruan.
The rest of the team consists of two aerospace engineers, Andrew Swingle and Tim Wheeler, two
electrical engineers, Peter Xu and Dmitriy Obukhov, and a mechanical engineer, Matt Wu. The
team’s faculty advisors are Dr. John Kosmatka, a professor in the Mechanical and Aerospace as
well as the Structural Engineering departments, and Dr. Ryan Kastner of the Computer Science
and Engineering department. Chad Foerster, a graduate student in structural engineering, serves in an
advisory role in the areas of autopilot integration and project management. Additionally, Thomas
Hong acts as the team's safety pilot and adviser on the airframe.
1.2 Mission Requirements and Goals
AUVSI's 2010 Student UAS Competition, sponsored by the Seafarer Chapter, simulates a real world
mission as described in the rules for the 2010 competition. In the simulated environment, the UAV
must autonomously navigate through a set of waypoints, staying close to the pre-determined path
to avoid detection by a simulated enemy. There are two targets along the path: one located directly
under the flight path and a second located off to the side of it, but with a known location. After
proceeding through the set of waypoints, the UAV enters a search area, at which point it must navigate
and explore the area to locate and identify an unknown number of targets. Finally, to simulate real
world re-tasking, the search area will change during the mission, and an additional target that is not
of a pre-determined shape must be located.
The UCSD team’s goal for 2010 was not only to design a UAS with the capability to complete
all of the mission objectives while maximizing autonomy, reliability, and safety, but to also design
a UAS that replicates the functionality of a real world UAV. To accomplish both of these goals the
team made drastic design changes, including an overhaul of the imaging system despite its success
in 2009. All the imagery payloads and the ground systems run independently of the flight systems,
so that safe flight through the autopilot or safety pilot link is not compromised if any vision
component fails.
1.3 Brief System Overview
UCSD’s UAS is made up of three subsystems: flight systems, payloads, and ground systems.
The flight systems include the airframe custom built in 2009 to meet the mission requirements, and
the autopilot which is responsible for the autonomous operation of the airframe. The autopilot is
managed via a ground station laptop which allows the operator to control the airplane, upload flight
plans, and view aircraft diagnostics. The payloads are comprised of a microcontroller to handle input
and output from the various devices and wireless communications, a camera gimbal system, a power
supply, and transmitters. The camera gimbal system uses a single Sony FBC-H11 block camera that
can shoot video in up to 1080i quality with automatic focus, white balance and serial controlled
zoom. The gimbal for the camera was designed and built by the team to accomplish the mission.
It is capable of ±90° tilt and continuous 360° pan with a resolution of 0.225°. Each component of
the payloads, including the autopilot, is housed in a separate custom built container on the airplane
with standard interconnects for power and communications for ease of maintenance.
The ground systems include the software and hardware used to process video, identify targets,
and control the camera gimbal system on the airplane. The ground station includes a video capture
card that collects the video and feeds it into a graphics processing unit (GPU) to automatically locate
and identify targets within the image. The ground station computer also receives a serial feed from
the microcontroller that contains all the telemetry data necessary to geo-reference the pixels in each
video frame so that the GPS coordinates of the targets can be located.
1.4 Competition Preview
Our UAS is designed both for performance and ease of use. Easy-access panels on the top of the
airframe and a compartmentalized payload configuration allow for quick troubleshooting if either the
payloads or the autopilot are not functioning properly. The RC safety pilot acts as pilot-in-command
and has the final say on aircraft airworthiness.
A powerful two-monitor computer for imagery and a laptop for the autopilot have separate
communications channels to the aircraft, which are checked before each flight.
station is used to program the takeoff procedure and the initial navigation route (which can be
dynamically changed in flight if necessary). The image station computer positions the camera so
that the lens is safe for takeoff and begins processing video.
The project manager will go through a final safety checklist with the team to verify the go/no-go
criteria. Upon successful completion of this, the UAS will be readied for its mission by arming
its flight motor. The autopilot ground station gives the command to take off. After an autonomous
takeoff, the aircraft transitions into a loiter while the team performs diagnostics, ensuring control
and communications are satisfactory for mission completion. The image station then instructs the
camera gimbal to operate in stabilization mode, where the gimbal’s position compensates for the
airplane’s attitude. Simultaneously, the autopilot is commanded to begin waypoint navigation. At
this point, the mission is fully underway.
As the UAV navigates through its waypoints, the video camera wirelessly streams video in real
time and the on board microprocessor synchronizes telemetry data with the video. This video passes
through an image processing algorithm running on the image station computer, the results of which
are sent through a character and shape recognition algorithm that automatically returns the target
parameters. The targeting operator points the camera using a joystick; rather than directly steering
the camera, the joystick commands GPS target locations, and on board calculations performed by the
microprocessor point the camera at the desired GPS coordinate. When a target is located during the
search, the camera is fixed on the target's GPS point and then zoomed in to provide a better quality
image of the target.
The UAS is fully capable of changing the search pattern or exploring a new location as required
by the mission, since this ability is critical in a real-world environment. Once all the targets have
been found, the operator commands the UAS to land on a specified runway, after which the motor is
disarmed, thus completing the mission.
2 Flight Systems Design and Overview

2.1 Airframe

2.1.1 Introduction
Figure 1: The UCSD Falco Airframe, shown in payload access and flight configurations
The UCSD Falco is an electric tractor propeller aircraft with a mid-placed wing, conventional
tail, and tricycle landing gear. The Falco has a steerable nose gear and full aerodynamic controls
with the addition of inboard flaps on the wings. With a wingspan of 10.25 ft and a length of 6.2
ft, the aircraft has a no-payload takeoff weight of 17 lbs, a maximum takeoff weight of 42 lbs, and
a nominal mission takeoff weight, NMTW, of roughly 28 lbs. At NMTW, the aircraft has a clean
configuration stall speed of 20 knots, a cruising speed of 42 knots for maximum efficiency, and a dash
speed of 80 knots with a flight duration of over 30 minutes.
The airframe meets or exceeds all requirements placed by the AUVSI Seafarers Chapter
and the Academy of Model Aeronautics. Its design in 2009 was the direct result of interaction with
current and past members of the UCSD AUVSI team as well as market projections for a low
endurance research UAS airframe. Because the airplane meets the mission requirements and is
comparable to real world UAV airframes, the team chose to focus on the development of other mission
critical systems for the 2010 competition. Nevertheless, a synopsis of the airframe design process
and capabilities is presented to provide a comprehensive picture of all facets of our system
design.
2.1.2 Design Methodology
The team demanded a portable electric airframe capable of carrying easily accessible, heavy,
and voluminous payloads. Portability would aid in transportation as well as reduce shipping costs,
a realistic concern for a San Diego based team. Electric propulsion reduced the vibrations caused
by a reciprocating engine and eliminated the associated exhaust residue, which had been a hassle for
transportation, maintenance/repair, and cameras. Past experience with various airframes showed that payloads
accessibility and volume were top-level concerns. High load capacity would allow room for payload
and airframe weight growth in future years.
Key performance parameters for the competition mission were payloads capacity, cruise velocity,
flight duration, load-factor, and good handling characteristics (table 1).
Parameter         Specification   Units
Payloads Weight   >10             Pounds
Payloads Volume   >1000           Cubic Inches
Cruise Velocity   <45             Knots
Flight Duration   >25             Minutes
Load Factor       ±5              G's at MTOW
Safety Factor     >1.2

Table 1: Team derived key performance parameters
2.1.3 Competitive Assessment
The Falco must be competitive during the AUVSI competition, but its design must also be
justified with a competitive assessment of the current UAV airframe market. Competition included
the ACR Manta, Raytheon Cobra, Arcturus T-15E, Optimum Solutions Condor 300, and the UCSD
modified Sig Rascal 110 (figure 2). These aircraft had capabilities similar to requirements set forth by
the team, but were inadequate for one reason or another. The commercial UAV airframes, excluding
the Sig Rascal, were far too costly with only the T-15E being electric powered. All five airframes
required large shipping crates and a station wagon or SUV for short-haul transportation. The T-15E
would need landing gear for AUVSI operations. The Sig Rascal 110 was previously used by the UCSD
team and modified with electric propulsion and tricycle gear, but it still lacked the required payload
volume and easy payloads accessibility. In order to be competitive, the new airframe needed to be cheaper
to produce, easier to ship and transport, offer a reasonable payload volume and weight, and maximize
structural and flight efficiency. Specifications for the competitors along with the Falco are listed in table 2.
Figure 2: Competitors from top right CW: UCSD Sig Rascal 110, Raytheon Cobra, ACR Manta,
Arcturus T-15, Optimum Solutions Condor (center)
2.1.4 Design
The Falco airframe is based roughly on the Sig Rascal 110, so many of its parameters have a
grounded basis in a proven design. As an example, the handling qualities of the Sig Rascal are
well-liked by the team’s safety pilot so the same tail volume coefficients are used on the Falco.
A conventional layout and similar sizing to the Sig Rascal also simplified the transition of the
Kestrel autopilot from the Sig Rascal to the Falco.

Parameter        Units   UCSD Falco  Manta  Cobra  Condor       T-15E     Sig Rascal 110
Wing Span        Feet    10.25       8.8    10.2   10.6         10.8      9.2
Length           Feet    6.5         6.2    9.3    6.2          6         6.3
Propulsion       Type    Electric    Gas    Gas    2x Electric  Electric  Electric
MTOW             Pounds  42          61     >100   40           45        25
Payload weight   Pounds  25          15     45     13           10        18
Payload volume   cu-in   1600        776    2600   <1000        800       <1000
Cruise velocity  knots   33-45       39-90  50-60  49           50        35-45

Table 2: Key flight parameters of competitors listed for comparison

The mid-placed wing is a compromise between
static lateral stability and payloads access. With this wing placement, the center of gravity is kept
below the wing while the entire upper surface of the fuselage can be opened up for very easy access
to payloads. The wing is a triple-tapered design which mimics the efficiency of an elliptical planform
while maintaining the manufacturability and stall characteristics of a rectangular or single/double
tapered wing. A rectangular main spar with a single rectangular carry-through joiner will increase
structural efficiency when compared to round spars and joiners used on some competitive airframes.
Thorough aerodynamic performance analysis was performed in XFLR5, an airfoil and aircraft analysis
tool for approximating properties at low Reynolds numbers (figure 3), before the configuration was
finalized [1].

Figure 3: XFLR5 allowed the aerodynamic design to be refined with relative ease

The fuselage and tail boom configuration allows the aircraft to fit into a mid-size
sedan when broken down so that more members of the team can transport the aircraft, simplifying
flight test logistics. Both the vertical and horizontal stabilizers detach from the boom to further
decrease shipping and stowing volume.
The Falco’s tricycle landing gear layout provides good ground stability for the autonomous takeoff
and landing of the aircraft. This layout also protects the camera gimbal and propeller from impact
during landing. The nose and main gear can be removed by using one allen-key each in order to
reduce shipping volume.
The electric propulsion system is the same system used on the Sig Rascal in past years. The
system chosen is a Neutronics Corporation 1515-2y motor with a 6.7:1 gearbox controlled by a Castle
Creations Phoenix HV-85 speed controller. The aircraft uses a standard APC propeller, or a stiffer
Mezjlik hollow-molded carbon fiber propeller, which offers an increase in propulsion efficiency. A
total of 12 5500mAh lithium-polymer battery cells in series provide the aircraft with a nominal 44.4
volts and a maximum installed current draw of 65 amps. The propulsion system has been shown
in flight testing to allow for missions of over 30 minutes. A summary of all aircraft parameters is
provided in table 3.
Parameter                            Units         Specifications
Wing Span                            Feet          10.25
Length                               Feet          6.50
Height                               Feet          2.35
Wing Area                            Square Feet   10.24
Aspect Ratio                                       10.20
Usable Payload Volume                Cubic Inches  1600
Maximum Payload Weight               Pounds        25
Structural Weight                    Pounds        9.2
Empty Takeoff Weight                 Pounds        17.3
Nominal Mission Takeoff Weight       Pounds        25-28
Maximum Takeoff Weight               Pounds        42
Load Factor at MTOW                  Gs            ±5
Flight Duration at NMTW              Minutes       >30
Stall, Cruise, Max Velocity at NMTW  Knots         20, 33-45, 80
Propulsion                                         Neu 1515-2y, 6.7:1 Gearbox, Mezjlik 20x11w Prop
Power                                              2x Neu 5500 6-cell Lithium Polymer Packs

Table 3: Falco Specifications
2.2 Autopilot

2.2.1 Procerus Kestrel Autopilot
One of the primary mission objectives for the competition is autonomy, so an autopilot is needed
to control the airframe for the mission. The requirements for an autopilot are:
• Robust autonomous control:
  – Autonomous waypoint navigation
  – Autonomous takeoff and landing capabilities
  – Reprogrammable in mission
• Ease of integration with a custom built airframe
• Cost effectiveness
• Ability to interface with a serial device
Using these criteria, the team selected Procerus Technologies' Kestrel autopilot. It weighs under
17 grams and it has all of the capabilities necessary to complete the mission. Several teams have used
this autopilot in past years and the UCSD team used it during the 2008 and 2009 competitions. The
team has demonstrated in testing and at the 2009 competition the ability to take off, navigate, and
land autonomously. A major design change from previous years was to use the autopilot as the sole
source of attitude and GPS measurements for all systems, thus consolidating the sensing for flight
controls and imaging, which had been separate in previous iterations of the project.
Figure 4: Command, communications, and control structure for the UCSD UAS
The autopilot hardware uses a suite of sensors to provide the necessary data for controlling
the aircraft, including a magnetometer, three gyroscopes, three accelerometers, dynamic and static
pressure ports and a GPS receiver. The Kestrel uses an onboard 8-bit 29MHz processor to handle
all the sensor data and run the code necessary for controlling the aircraft via four servo ports on
the Kestrel unit and eight additional servo ports on a servo expansion board. The Kestrel uses
a Maxstream 9XTend 900MHz 1W wireless modem to communicate with the ground station; this
connects to the bottom face of the autopilot's PCB through the modem header. The autopilot
is housed in an aluminum enclosure with a fan so that it remains cool and protected.
Figure 5: The Kestrel autopilot.
2.2.2 Virtual Cockpit Ground Station
The ground station for the Kestrel autopilot consists of a Commbox and the Virtual Cockpit software.
The software is loaded on a laptop connected to the Commbox via RS-232, which in turn
communicates with the autopilot via a Maxstream wireless modem. The Commbox has its own internal
battery, allowing the ground station to be used in the field without external power. A standard
R/C transmitter interfaces with the Commbox through a trainer cable, allowing manual control of
the aircraft through the Kestrel.
The Virtual Cockpit (VC) allows the user to configure the Kestrel and safely operate the unit
during autonomous and semi-autonomous flight. The team used the VC to set up the Kestrel for the
specific servo arrangement on the aircraft and calibrate the sensors on the Kestrel for its particular
mounting arrangement within the airframe. The VC also features a PID gain window, used to tune
the various PID control loops for acceptable response under steady flight.
Waypoint navigation is activated through the VC. The user uploads a list of waypoints or dynamically creates a list by clicking on a geo-referenced image file. In addition, the user can manually
enter additional waypoint parameters if needed. The VC and Kestrel have the ability to modify the
list of waypoints in flight, allowing for dynamic retasking.
A heads up display with relevant aircraft parameters and modes is present in the main window
of the VC at all times for the benefit of ground station personnel (figure 6). Telemetry data for
each flight is automatically stored by the software for post-flight analysis and for replaying the flight in the VC.
More detailed data can be extracted and logged from the various sensors on the autopilot through a
data logging window in VC. This option allows for precise analysis of the sensor system outputs for
troubleshooting.
Figure 6: Image of the autopilot ground station showing a Google Earth image of one of the fields
where extensive testing was done for the Kestrel autopilot. The orange outline is a flight path that was
used during testing.
2.2.3 Software in Loop Simulation
After the 2009 competition there were two issues that needed to be addressed with the autopilot.
The first was problems with transitions between the higher autopilot modes during takeoff, and the
second was a long-period altitude oscillation observed during flight. These issues could have been
fixed with flight testing, but due to university concerns about performing autonomous flights in the
San Diego area, fully autonomous flight testing has been limited in preparation for the 2010
competition compared to previous years.
In order to deal with this challenge a software in loop simulation was developed to interface
the Virtual Cockpit software with the RealFlight 4.5 simulator. Using the mass properties and
aerodynamic characteristics of the UAV, a model was developed for the simulation environment in
RealFlight 4.5. From there, the PID gains were tuned in the virtual environment to eliminate the
two outstanding issues the team observed in the previous configuration. The simulation environment
also allowed the team to practice using the autopilot in order to be fully prepared to perform an
autonomous flight. When the team finally had an opportunity to fly the airplane autonomously, the
system showed improved performance as compared to the 2009 competition configuration.
2.2.4 Autopilot Safety Considerations
Autopilot safety is realized in two ways in the UAS. The first is the set of built-in autopilot failsafes
that govern its behavior if communication with the ground station is lost. The second is an RX-Mux
(receiver multiplexer), which allows the RC pilot to take control of the airframe at any point during
the flight.
The autopilot failsafes used comply with the rules set by AUVSI for
the competition. After 30 seconds of no communications the autopilot will return home and loiter,
and after three minutes without communications the autopilot will put the airframe into a spiral
dive. The RxMux (figure 7) allows one set of servo outputs to be controlled by two different
sources. The autopilot and the RC pilot are input signals to the mux and the RC pilot specifies
which input is used. Therefore, if anything goes wrong at any point in the mission, or the autopilot
is not behaving as expected, the RC pilot can take control. This capability is especially valuable
when first integrating the autopilot with the airframe.
Figure 7: RxMux board without any servos attached
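The timing behavior of these failsafes is simple enough to sketch; the plain C fragment below
illustrates the logic described above (the handler names are hypothetical, since the real failsafes are
built into and configured through the Kestrel, not written by the team):

    #include <stdint.h>

    /* Hypothetical hooks into the flight controller; the real failsafes
     * live inside the Kestrel firmware. */
    void navigate_waypoints(void);
    void return_home_and_loiter(void);
    void enter_spiral_dive(void);

    #define RTH_TIMEOUT_S  30u    /* loiter at home after 30 s of silence */
    #define TERM_TIMEOUT_S 180u   /* spiral dive after 3 min of silence   */

    void failsafe_step(uint32_t now_s, uint32_t last_packet_s)
    {
        uint32_t silence = now_s - last_packet_s;

        if (silence >= TERM_TIMEOUT_S)
            enter_spiral_dive();        /* flight termination           */
        else if (silence >= RTH_TIMEOUT_S)
            return_home_and_loiter();   /* wait for the link to recover */
        else
            navigate_waypoints();       /* normal autonomous flight     */
    }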
3 Payloads Design and Overview

3.1 Overview
The payloads for the UAV are a Sony FBC-H11 block camera, a gimbal assembly that houses
and points the camera, a Digi Rabbit microprocessor that handles all the inputs and outputs for the
payloads, a power supply unit, and a digital video transmitter. The goal of the payloads system is
to stabilize and control the camera gimbal assembly such that the camera points at a commanded
GPS position on the ground and can zoom in to collect high resolution target data.
The payloads system is designed to be modular, with each individual component having its own
enclosure to allow for ease of troubleshooting and organization of the payloads. The payloads system
also interacts with the autopilot over a serial link to receive telemetry data as seen in figure 8, and
then transmits all necessary data to the ground system for processing.
Figure 8: Block diagram of the three systems, with arrows indicating flow of information.
3.2 Cameras
One of the primary objectives of the competition is to locate targets and identify the following
parameters: background color, shape, orientation, alphanumeric, and alphanumeric color. In our
design we considered a single still digital camera, multiple USB cameras, an analog video camera,
and a high definition block camera. The functional requirements for identifying targets are:
• Resolve the targets at altitudes up to 500 ft
• Lightweight and small enough to physically mount in the airframe
• Take color image/video
• Transmit video/images to the ground or on board computer in real time
• 120 degree field of view to locate any off-path targets
Since the team had committed to designing a pointing gimbal, the field of view requirement
could be met by physically moving the camera rather than relying solely on the camera's field of view.
Therefore, video quality and features such as remote zoom capability, auto-focus, and auto-white
balance were the most important aspects in selecting a camera, which led to the Sony FBC-H11
block camera being selected for the system, as seen in table 4.
The Sony camera can stream video in varying quality between standard definition NTSC up
to 1080i HD. The limit on video quality is therefore in the transmitter used to stream the video
to the ground, which is discussed later. The camera also has auto-focus and auto-white balance
features which allow for good video to be taken in any lighting conditions that would be expected
at competition. Another important feature of the camera is its 10X optical zoom, which can easily
be controlled by the microprocessor over the serial link; this allows for sufficient resolution for
target identification from any altitude allowed during the competition.
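Sony block cameras in this class are typically commanded over RS-232 with Sony's VISCA protocol;
the sketch below shows what a direct zoom command might look like under that assumption (the
framing shown is the generic VISCA form, and serial_write is a hypothetical UART helper, not the
team's actual code):

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical UART helper provided elsewhere in the payload code. */
    void serial_write(const uint8_t *buf, size_t len);

    /* Send a VISCA "direct zoom" command to camera address 1.
     * pos is the zoom position (0 = wide, max = full telephoto),
     * packed one nibble per payload byte as VISCA requires. */
    void camera_set_zoom(uint16_t pos)
    {
        uint8_t msg[9] = {
            0x81,                       /* header: controller -> camera 1 */
            0x01, 0x04, 0x47,           /* command: CAM_Zoom, direct      */
            (uint8_t)((pos >> 12) & 0x0F),
            (uint8_t)((pos >> 8)  & 0x0F),
            (uint8_t)((pos >> 4)  & 0x0F),
            (uint8_t)( pos        & 0x0F),
            0xFF                        /* VISCA message terminator       */
        };
        serial_write(msg, sizeof msg);
    }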
Criterion            Weight  FBC-H11  Canon S3-IS  TSC-TC33USB  Analog
Image Quality        4       4        3            2            1
Features             3       4        3            2            1
Weight               2       3        1            4            2
Size                 2       3        2            4            1
Lens Distortion      4       2        3            1            4
Cost                 1       1        4            4            1
Power Consumption    1       3        2            4            4
Ease of Interfacing  1       3        1            2
Total                        55       38           36           34

Table 4: Camera trade study between the Sony FBC-H11 block camera, Canon S3-IS digital still camera,
Sentek TSC-TC33 USB video camera, and a Blackwidow AV analog video camera.
3.3 On Board Processing
For the payloads system to perform the mission the camera, autopilot, and gimbal need to
function together, and therefore an on board microprocessor is needed in order to handle serial
communications between the three payloads devices and the ground. The requirements for this
processor were:
• 4 serial ports: 3 for payloads devices, 1 for wireless link to ground
• Lightweight and low power consumption
• Processing power to do gimbal calculations
• Parallel processing capability
• Inexpensive
Figure 9: Rabbit BL1800 processor.
In 2009, the payloads system was run by a computer with a 1.8GHz Intel Pentium
processor, which required a full operating system to run. This had more processing power than is
necessary to meet the functional requirements of the on board processor, and its power consumption
was very high. Therefore, this year the team chose to use a Rabbit BL1800 processor in the system
because it meets the requirements with the minimum amount of processing power.
The Rabbit BL1800 processor stands out from other microcontrollers for its ability to handle
serial communication. There are four serial ports available on the Rabbit, and each serial port has
a dedicated pin connection, which exactly fits the system's requirements. Also, unlike many other
microcontrollers, the Rabbit is programmable in a relatively high level language, Dynamic C,
which is similar to the standard C programming language. Dynamic C allowed the team to develop
a lightweight operating system which manages the several concurrent processes the microcontroller
is tasked with. Finally, there are no moving parts involved with the Rabbit processor and its main
board, which makes the system very compact and allowed it to fit inside a relatively small enclosure.
3.4 Gimbal
The camera gimbal is an essential part of the payloads system because it allows a user to use
the camera to locate and identify the ground targets in support of the overall mission. The camera
was selected before the gimbal design was finalized, and therefore it drove a few of the requirements
for the gimbal design. The rest of the requirements were driven by standard flight conditions of the
airframe. These requirements for the design were:
• 360° continuous rotation in the azimuth direction
• ±90° rotation in elevation
• 1.5° resolution in both degrees of freedom
• 30°/s elevation rate
• RS-232 controllable
Figure 10: Right: CAD model of the gimbal assembly. Left: Assembled gimbal mounted underneath
the airplane.
With these minimum requirements in mind, a two degree of freedom gimbal driven by stepper motors
was designed and fabricated, as seen in figure 10. The finished gimbal met or
exceeded all of the requirements, with the gimbal pointing resolution being 0.225° in both
degrees of freedom.
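The stated 0.225° resolution corresponds to 1600 discrete steps per revolution (360°/0.225°), so
converting a commanded gimbal angle into stepper counts reduces to a simple quantization, sketched
here for illustration:

    #include <math.h>

    #define STEPS_PER_REV 1600L    /* 360 deg / 0.225 deg per step */

    /* Convert a commanded gimbal angle in degrees to the nearest
     * whole step count for the stepper driver. */
    long angle_to_steps(double angle_deg)
    {
        return lround(angle_deg * STEPS_PER_REV / 360.0);
    }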
The gimbal has five operational modes, all controlled by the Rabbit microprocessor, which takes
inputs from the autopilot and receives control signals from the ground station: manual mode, stabilized
mode, lock on target mode, look down mode, and takeoff/landing mode. In manual mode the gimbal
is controlled manually through a joystick operated from the ground station. The joystick provides an
intuitive and cost effective way to manually control the motion of the camera. The manual mode
does not automatically compensate for the motion of the plane, and therefore its primary purpose is
initial testing and troubleshooting.
While in stabilized mode the gimbal maintains position with respect to the heading of the
airframe, and allows user input to change the offset. In stabilized mode the gimbal compensates for
the pitch and roll of the airplane, and this mode is used during the mission when searching for targets.
Once a target has been located using stabilized mode, target lock mode is activated, and the gimbal
maintains lock on a GPS position so that the camera can zoom in and get a better image of the target.
In look down mode the gimbal returns to a predetermined downward looking position. This
command quickly centers the camera to a known position so that the user has a good
reference for where the camera is pointing. Finally, in takeoff/landing mode the gimbal points the camera
in a direction that protects the camera and the lens from any damage that might occur due to
debris on the runway. In this mode the camera is pointed straight up into the chassis.
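As an illustration of the kind of on-board pointing calculation used in lock on target mode, the
sketch below converts a target's GPS position into pan and tilt commands: the target offset is
expressed in local North-East-Down coordinates under the flat earth assumption, rotated into the
body frame using the autopilot's attitude, and reduced to two angles (all names and conventions here
are illustrative, not the team's Dynamic C code):

    #include <math.h>

    #define PI       3.14159265358979323846
    #define DEG2RAD  (PI / 180.0)
    #define RAD2DEG  (180.0 / PI)
    #define EARTH_R  6371000.0          /* mean earth radius, meters */

    /* Compute gimbal pan/tilt (degrees) needed to point at a GPS target.
     * Aircraft state comes from the autopilot telemetry; the target is
     * assumed to lie on the ground directly below its lat/lon. */
    void point_at_target(double ac_lat, double ac_lon, double ac_alt,
                         double yaw, double pitch, double roll,
                         double tgt_lat, double tgt_lon,
                         double *pan, double *tilt)
    {
        /* Target offset in local North-East-Down coordinates (flat earth). */
        double n = (tgt_lat - ac_lat) * DEG2RAD * EARTH_R;
        double e = (tgt_lon - ac_lon) * DEG2RAD * EARTH_R * cos(ac_lat * DEG2RAD);
        double d = ac_alt;              /* down distance to the ground */

        double cy = cos(yaw * DEG2RAD),   sy = sin(yaw * DEG2RAD);
        double cp = cos(pitch * DEG2RAD), sp = sin(pitch * DEG2RAD);
        double cr = cos(roll * DEG2RAD),  sr = sin(roll * DEG2RAD);

        /* Rotate the NED vector into the body frame: yaw, pitch, roll. */
        double x1 =  cy * n + sy * e;       /* yaw about the down axis    */
        double y1 = -sy * n + cy * e;
        double x2 =  cp * x1 - sp * d;      /* pitch about intermediate y */
        double z1 =  sp * x1 + cp * d;
        double y2 =  cr * y1 + sr * z1;     /* roll about the body x axis */
        double z2 = -sr * y1 + cr * z1;

        *pan  = atan2(y2, x2) * RAD2DEG;             /* azimuth off the nose */
        *tilt = atan2(z2, hypot(x2, y2)) * RAD2DEG;  /* depression angle     */
    }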
3.5 Payload Communications

3.5.1 Wireless Links
There are a total of four wireless links from the ground to the airplane during a mission. The
autopilot communicates with the Virtual Cockpit ground station over a 900MHz link. The rabbit
microprocessor also uses a 900 MHz link, on a different channel, and both of these 900MHz links are
wireless serial connections. The safety pilot uses a standard 72MHz RC controller link so that
he can take over control of the servos if anything goes wrong during the mission.
The last data link to the ground is the video link, a 2.4GHz Coded Orthogonal Frequency
Division Multiplexing (COFDM) link carrying a digital video signal. The transmitter used to
maintain this video link was loaned to UCSD by Broadcast Microwave Services, and it allows the
video to be transmitted digitally instead of using an analog signal. This transmitter eliminates the
noise observed in the team's video during previous years, and therefore it provides much
better video quality on the ground. The transmitter, however, can only transmit NTSC video at
720x486 resolution, which makes it the bottleneck in video quality.
3.5.2 Serial Data Links
There are four serial links between the Rabbit microprocessor and the payloads systems, and
these operate at the following baud rates and logic levels:
• Camera : RS-232, 38400
• Autopilot: TTL , 115200
• Gimbal: RS-232, 9600
• Wireless modem to ground station: CMOS, 115200
The microprocessor runs an operating system programmed by the team which cycles through the
various commands it sends and receives at a rate of approximately 30Hz so that control of the
payloads by the ground station is fluid.
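The team's scheduler is written in Dynamic C on the Rabbit; a plain C sketch of the general
round-robin shape of such a loop is shown below (handler and timer names are illustrative):

    #include <stdbool.h>

    /* Illustrative handlers; each reads/writes one serial device
     * without blocking, then returns. */
    void service_camera(void);    /* RS-232, 38400 baud  */
    void service_autopilot(void); /* TTL,    115200 baud */
    void service_gimbal(void);    /* RS-232, 9600 baud   */
    void service_ground(void);    /* CMOS,   115200 baud */

    bool cycle_timer_expired(void); /* true every ~33 ms (about 30 Hz) */

    int main(void)
    {
        for (;;) {
            /* Each pass polls every link once; no handler is allowed to
             * block, so the whole cycle completes well inside 33 ms and
             * ground station control of the payloads stays fluid. */
            service_camera();
            service_autopilot();
            service_gimbal();
            service_ground();

            while (!cycle_timer_expired())
                ;   /* idle until the next 30 Hz tick */
        }
    }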
4 Ground Systems Design and Overview

4.1 Graphical User Interface Software
The ground station software is entirely custom written for this competition. It is encapsulated
into one program, referred to as the “Image Station.” The main purpose of the Image Station is
to provide a graphical user interface (GUI), figure 11, displaying essential flight information in an
intuitive and organized fashion. It is the processing core of the entire imaging pipeline because it
runs on specialized hardware that can handle complex computation in real time.
Figure 11: The two monitor graphical user interface developed for displaying and analyzing target
data. The left columns display telemetry data and warnings, in the middle are video options and the
real time video stream, and the right is a map displaying the current flight.
The image station displays a constantly updating feed of live video from the airplane. It is
designed so that this video can be overlaid with various information such as GPS data, or
toggled to display currently active image filters such as saliency. Video is acquired
through a capture card which is agnostic to input format; this means that we can connect virtually
any type of video device to our system and have it perform in a predictable fashion.
The video data is constantly being processed by the imaging pipeline, which first calculates a
saliency map to extract small regions of the image to be run through the more intensive object
recognition algorithms. This data is saved into a database and classified into one of three categories:
"candidate regions," which contains interesting regions as determined by saliency; "possible targets,"
which contains autonomously identified targets; and finally "verified targets," which are "possible
targets" that have been manually confirmed by the human operator.
In addition to a heads-up-display presentation of airplane telemetry data, the data is also
displayed in a frequently updating tabular form. This information is logged to disk so that it
is available for later analysis. All input data to the image station comes from a wireless serial link
to the microcontroller on the airplane. The image station requires no other connectivity to operate;
it simply needs a video source and the data link.
An operator can manipulate the gimbal by using either a game pad or the keyboard and mouse.
For example, moving the game pad joystick causes an instantaneous response in the gimbal position,
which can be observed by watching the video display or looking at the constantly updated telemetry
data. Additional functionality includes changing the camera zoom level and entering the gimbal's
various modes of operation.
The image station also possesses the capability to record video and quickly export images to
disk. Video is automatically compressed and split into manageably sized files.
Finally, the image station renders a map that tracks the current position of the airplane, current
viewing area of the camera, and targets found. The user can interact with this map to send specific
GPS coordinates to the gimbal to lock to, or add and remove targets manually.
4.2 Image Processing

4.2.1 Image Rectification
An essential task for imagery captured from an aerial platform is to exactly determine the
physical location of any point in an image - a process known as georeferencing. In order to find
the latitude and longitude of a potential target, it was necessary to perform a series of coordinate
transformations to take the camera position vector to a local WGS 84 Cartesian coordinate system.
Under this WGS 84 coordinate system, actual physical locations can be derived for any point in the
image. To accomplish this, the following rotation matrix was constructed:
R = R_{camera,gimbal} R_{gimbal,platform} R_{platform,unrotated} R_{unrotated,WGS84local}
where each R_{ij} takes one frame of reference to another coordinate system. The individual R_{ij}
were constructed either via Euler angles or simple change of basis matrices that converted from
North-East-Down to North-West-Up coordinate systems.
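As a concrete illustration of the Euler-angle construction, the platform-to-unrotated rotation can be
written with the standard aerospace yaw-pitch-roll (ψ, θ, φ) sequence reported by the autopilot; this
is the textbook body-to-local direction cosine matrix, shown here for reference rather than taken
verbatim from the team's code:

    R_{platform,unrotated} = R_z(ψ) R_y(θ) R_x(φ) =
        [ cosψ cosθ    cosψ sinθ sinφ - sinψ cosφ    cosψ sinθ cosφ + sinψ sinφ ]
        [ sinψ cosθ    sinψ sinθ sinφ + cosψ cosφ    sinψ sinθ cosφ - cosψ sinφ ]
        [ -sinθ        cosθ sinφ                     cosθ cosφ                  ]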
With this position vector in the local WGS 84 coordinate system, we can then find the location
of the airplane based upon its GPS data in this same coordinate system. With this last piece of
information, we are able to project any point in the image onto the earth and derive its physical
location. Given our low altitude, we make the simplifying assumption that the earth is a flat plane,
which causes negligible error and reduces computational complexity significantly.
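A sketch of this projection step: a pixel is back-projected into a ray using the camera parameters,
rotated into the local frame with the composed R, and intersected with the ground plane (illustrative
code under the stated flat earth assumption; the focal length, principal point, and frame conventions
are placeholders):

    #include <stdbool.h>

    /* Project image pixel (u, v) onto the flat ground plane z = 0.
     * R is the composed camera-to-local rotation (row-major 3x3),
     * cam[3] is the camera position in the local frame (cam[2] = altitude,
     * z measured up), f is the focal length in pixels, (cx, cy) the
     * principal point. Returns false if the ray never hits the ground. */
    bool pixel_to_ground(const double R[3][3], const double cam[3],
                         double f, double cx, double cy,
                         double u, double v, double ground[2])
    {
        /* Ray direction in camera coordinates (z forward). */
        double rc[3] = { (u - cx) / f, (v - cy) / f, 1.0 };

        /* Rotate the ray into the local level frame. */
        double rl[3];
        for (int i = 0; i < 3; i++)
            rl[i] = R[i][0]*rc[0] + R[i][1]*rc[1] + R[i][2]*rc[2];

        if (rl[2] >= 0.0)
            return false;            /* ray points at or above the horizon */

        double t = -cam[2] / rl[2];  /* distance along the ray to z = 0 */
        ground[0] = cam[0] + t * rl[0];
        ground[1] = cam[1] + t * rl[1];
        return true;
    }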
Using this information, we can use the four corner points of the image to create a homography,
a transformation mapping points on the image to points on the earth plane. Doing this removes the
effects of projective distortion and gives our image the appearance of being viewed from directly
overhead. The importance of this is that under projective perspective, shapes such as squares may
have appeared as arbitrary quadrilaterals. After rectifying the image, right angles and parallel lines
are restored, meaning that circles look like circles and squares look like squares. This is essential
for performing automated target recognition.
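For reference, the homography H maps homogeneous image coordinates to ground-plane coordinates,
(x, y, 1)^T ∝ H (u, v, 1)^T with h_{33} = 1; each corner correspondence (u_i, v_i) → (x_i, y_i)
contributes two linear constraints,

    x_i (h_{31} u_i + h_{32} v_i + 1) = h_{11} u_i + h_{12} v_i + h_{13}
    y_i (h_{31} u_i + h_{32} v_i + 1) = h_{21} u_i + h_{22} v_i + h_{23}

so the four image corners and their ground projections determine the eight unknowns of H. This is
the standard direct linear transform, stated here as a sketch of the computation rather than the
team's exact implementation.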
4.2.2 Autonomous Target Location and Identification
Given that even when operating at standard definition the onboard camera captures roughly
320Mb of data per second, we needed an approach to filter this data quickly and reliably. The
solution the team chose to use was saliency, a biologically inspired approach for finding regions of
interest in an image. By imitating the methods that a human eye uses to effortlessly find interesting
regions in the field of view, the saliency algorithm identifies regions of each frame that stand out
based upon various types of contrast including color and edge intensity.
Since the targets we are searching for are well defined shapes (strong edges) consisting of solid
colors, they inevitably contrast strongly with their background (mostly natural objects such
as grass and foliage, which are closely colored and in general have no well defined edges). The end
result is that for each frame of video we autonomously filter out an overwhelming proportion
of it as "non interesting" and do not process it when looking for an actual target. This allows us to
keep our analysis real time. The output of the saliency algorithm is a heat map of sorts which
displays more "interesting" areas as bright regions, while regions deemed "uninteresting" are darker,
as seen in figure 12.
Figure 12: Example of the Saliency algorithm running on a set of practice targets. The targets
appear as bright white regions in this noisy example, and these regions would be fed through the
autonomous identification algorithms.
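The team's saliency implementation is a GPU port of a biologically inspired model with multiple
contrast channels; as a much simplified CPU illustration of the underlying center-surround idea,
each pixel can be compared against the mean of its neighborhood, with large differences marked as
salient (a sketch only, not the actual algorithm):

    #include <stdlib.h>

    /* Simplified center-surround saliency: for each pixel, saliency is the
     * absolute difference between the pixel ("center") and the mean of a
     * (2r+1)x(2r+1) neighborhood ("surround"). img and sal are w*h
     * grayscale buffers. The real pipeline adds color and edge channels
     * and runs on the GPU. */
    void saliency_map(const unsigned char *img, unsigned char *sal,
                      int w, int h, int r)
    {
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                long sum = 0, n = 0;
                for (int dy = -r; dy <= r; dy++) {
                    for (int dx = -r; dx <= r; dx++) {
                        int xx = x + dx, yy = y + dy;
                        if (xx >= 0 && xx < w && yy >= 0 && yy < h) {
                            sum += img[yy * w + xx];
                            n++;
                        }
                    }
                }
                long diff = labs((long)img[y * w + x] - sum / n);
                sal[y * w + x] = (unsigned char)(diff > 255 ? 255 : diff);
            }
        }
    }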
The saliency algorithm itself is implemented to run on a graphics processing unit (GPU), whose
massively parallel computation allows us to achieve real-time analysis for both standard
and high definition images. An additional advantage of using the GPU is that we can perform other
CPU bound processes without the overhead of saliency processing slowing down computation.
It is important to note that saliency does not make any subjective discrimination as to what type
of objects it finds “interesting” or not. The algorithm has no knowledge of what a target is, what
constitutes background noise, or what we would like it to find interesting. It simply uses biologically
motivated conditions for finding a region “interesting” that match what the human visual system
would do given a similar situation. This means that some regions determined as interesting may in
fact be irrelevant to our purposes, though the chances of a target region being deemed uninteresting
are, in our testing, very low.
Additionally, we can exploit domain specific knowledge about targets, such as their expected
physical dimensions, to further reduce the output of the saliency algorithm.

Figure 13: Two samples of the autonomous target and shape recognition algorithm which show the
shape and alphanumeric being extracted from the full color image of the target.
From an operator's perspective, saliency makes searching for targets much more feasible in a
time constrained environment; it acts like a second set of eyes scanning over images and finding
things to look at. By highlighting potential targets in real time, the human operator can perform
other tasks while watching the video stream without fear of missing a target.
Once these interesting regions are found via saliency, they are processed using a trained algorithm
that performs more complex analysis to autonomously determine target parameters such as location,
shape, letter, and color information.
5 Testing, Performance and Safety

5.1 Individual Systems Tests

5.1.1 Autopilot Testing
The UCSD team began the 2009-2010 year with a functioning autopilot that could perform
all of the mission objectives. Because the university was concerned about legal and liability issues
regarding autonomous flying, the primary method of testing the autopilot for most of the year was
software in loop simulations. However, the team did perform test flights near the end of the year to
ensure that everything was performing as expected.
5.1.2 Imagery Testing
To test the image systems in a realistic setting the team constructed over 20 competition style
targets, and testing for the imagery systems was done in segments as functionality was developed.
Initially, the camera was used to take pictures of targets from the roofs of buildings on campus to
collect sample target data. Then the camera was statically mounted in the airframe to judge image
quality from the air. However, until the gimbal was installed in the airplane it was difficult to locate
targets on the ground. Once the gimbal was completed and installed in the airplane additional testing
was performed, which showed that targets could be identified from a reasonable altitude even without
zoom implemented. Finally, remote zoom on the camera was implemented, which allowed all targets
to be identified.
5.1.3 Airframe Testing
The first Falco airframe was constructed 2.5 lbs below its estimated weight of 19.5 lbs and met
its static proof load of 6 Gs. Assembly time out of the shipping crate was below 5 minutes requiring
nothing more than 3 allen-keys. Taxi testing proved that the aircraft did not tip over
under maximum lateral forces, thus protecting the wing-tips and propeller.
The first flights were conducted without payloads at its empty flying weight in order to learn
its handling characteristics and expand its flight envelope. With the center of gravity at its forward
limit, the aircraft was not able to stall while still being able to rotate well under takeoff speed and
flare upon landing. The control surfaces provided excellent control and authority, and the aircraft did
not require any trim due to the team’s extensive aero-analysis. Drag was significantly reduced when
compared to the Sig Rascal; this was noticed by the 30% decrease in battery usage when compared
to flights of the same duration and takeoff weight. The empty aircraft (more susceptible to winds
and gusts) was flown in crosswinds at 17 knots gusting up to 22.
The only fix needed for the airframe since its initial design on the drawing board was that the
motor’s anti-vibration bulkhead needed to be strengthened. A dead-stick landing off the runway
due to this problem showed that the structures were durable enough while still maintaining a very
competitive empty weight fraction. Again, after flight testing, the airframe was able to be broken
down in less than 5 minutes and transported in a mid-sized sedan with an open seat for a passenger.
Storage and shipping volume can be seen in figure 14.
Figure 14: Shipping configuration; fuselage, tail and boom are between the wings
5.1.4 Communications Tests
With four different wireless signals being transmitted from the air to the ground, testing to
ensure that there was no interference was essential. Initially, the team's safety pilot wanted to use
a 2.4GHz RC controller to control the airplane; however, this was in the same frequency spectrum
as the video link. Therefore, ground range testing was performed, which showed that the safety
pilot would intermittently lose control of the airplane due to the video transmitter saturating the
frequency. This testing drove the decision to place the safety pilot in the 72MHz band
to avoid any potential conflict between signals on the plane and the safety link. Tests were also done
to ensure the two different 900MHz links on the airplane did not cause any harmful interference.
There are multiple channels to use in the 900MHz band, so this turned out not to be a significant
issue.
Another set of communication tests performed was range tests for each of the individual
signals. The safety pilot's 72MHz signal is a standard RC link and worked well as soon as it was
implemented. Range tests were performed in the air for the 2.4GHz video link, and it was found that
the link was solid up to the desired range, but it was sensitive to being blocked by carbon in the
airframe. Therefore, the antenna was moved out to the tail, where line of sight was always maintained.
Finally, range tests were performed across the UCSD campus with the Rabbit's 900MHz link, which
showed that even across the ground the connection could be maintained for the distance required by
the competition.
5.2 Full Systems Tests
Full system tests were performed to show that all of the individual components could function
together in the airframe. Also, it was necessary to show that the communications could function
simultaneously. All of the integrated components can be seen in figure 15. The full systems tests
were performed locally in San Diego at an RC field with the autopilot on board processing GPS and
attitude, but not controlling the airplane. Targets were placed in the field for each test, and as gimbal
functionality increased, better images were collected. Video and stills from all flight tests were
captured using the ground station computer to be analyzed later to determine areas where the image
systems could be improved.
Progressing towards a final competition-ready state, the team intends to perform fully autonomous
practice missions. These final tests will focus on completing a simulated setup and mission within the
time constraints placed on the competition. Full target messages will be spelled out, and the members
of the team operating the ground systems will be able to locate them.
Figure 15: The image on the left shows the payloads systems after being fully integrated with the
airframe. At the far left is the video transmitter; the three aluminum boxes are the Rabbit
microprocessor, power supply, and autopilot enclosures. At the right of the image are the gimbal and
its drivers. The image on the right is 2009's configuration, which is much more disorganized than the
enclosure design used this year.
6 Acknowledgments

6.1 Sponsors

6.2 Faculty and Staff
We would like to thank the following members of the UC San Diego community for all of the guidance
that they have given our team:
• Professor John Kosmatka, Faculty Advisor, Structural and Mechanical/Aerospace Engineering
• Dr. Mike Wiskerchen and Tehseen Lazzouni, California Space Grant Consortium
• Chad Foerster, Graduate Student, UCSD SE Department
• Professor Ryan Kastner, Computer Science and Engineering
• Thomas Hong, Graduate Student, UCSD SE Department
• Ben Ochoa of 2d3 for help with geo-referencing
• Scripps Institution of Oceanography Machine Shop
References
[1] XFLR5. http://xflr5.sourceforge.net/xflr5.htm.