When a firefighter, first responder or soldier operates a small, lightweight flight vehicle inside a building, in urban canyons, underground or under the forest canopy, the GPS-denied environment presents unique navigation challenges. In many cases, loss of GPS signals can render these vehicles inoperable and, in the worst case, unstable, potentially putting operators, bystanders and property in danger.
Attempts have been made to close this information gap and give UAVs alternative ways to navigate their environments without GPS. But those attempts have resulted in further information gaps, especially on UAVs whose speeds can outpace the capabilities of their onboard technologies. For instance, scanning LiDAR routinely fails to match the vehicle's location accurately when the UAV is flying through environments that lack buildings, trees and other orienting structures.
To address these drawbacks, a team from Draper and MIT has developed advanced vision-aided navigation techniques for UAVs that do not rely on external infrastructure, such as GPS, detailed maps of the environment or motion capture systems. Working together under a contract with the Defense Advanced Research Projects Agency (DARPA), Draper and MIT created a UAV that can autonomously sense and maneuver through unknown environments without external communications or GPS under the Fast Lightweight Autonomy (FLA) program. The team developed and implemented unique sensor and algorithm configurations, and has conducted time-trials and performance evaluations in indoor and outdoor venues.
The team fused vision and inertial sensing into a navigation system that combines the advantages of both approaches and accumulates error more slowly over time than either technique on its own, producing a full position, attitude and velocity state estimate throughout the vehicle trajectory. The result is a navigation solution that lets a UAV retain all six degrees of freedom and fly autonomously, without GPS or any external communication, at vehicle speeds of up to 45 miles per hour.
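The intuition that fused vision and inertial sensing accumulates error more slowly than either technique alone can be illustrated with a minimal 1-D simulation. This is purely an illustrative sketch with hypothetical numbers, not the Draper/MIT implementation: a biased accelerometer makes the inertial-only position estimate drift quadratically, while periodic vision-derived position fixes bound that drift.

```python
# Illustrative 1-D sketch (hypothetical parameters): inertial-only
# propagation drifts due to accelerometer bias; periodic vision fixes
# bound the error, which is the intuition behind vision-aided
# inertial navigation.

def run(dt=0.01, steps=1000, bias=0.05, vision_every=50, gain=0.5):
    true_pos = true_vel = 0.0
    est_pos = est_vel = 0.0
    accel = 1.0  # constant true acceleration, m/s^2
    for k in range(steps):
        # Ground-truth motion.
        true_pos += true_vel * dt + 0.5 * accel * dt**2
        true_vel += accel * dt
        # Inertial propagation with a biased accelerometer reading.
        meas = accel + bias
        est_pos += est_vel * dt + 0.5 * meas * dt**2
        est_vel += meas * dt
        # Periodic vision fix (idealized here as a perfect position
        # measurement, blended in with a fixed gain).
        if vision_every and (k + 1) % vision_every == 0:
            est_pos += gain * (true_pos - est_pos)
    return abs(est_pos - true_pos)

drift_only = run(vision_every=0)  # pure inertial: error grows with t^2
fused = run(vision_every=50)      # fused: vision updates bound the drift
assert fused < drift_only
```

The same trade appears in the real system: the IMU supplies smooth high-rate motion estimates between vision updates, and the vision measurements keep the slowly accumulating inertial error in check.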
“The biggest challenge with unmanned aerial vehicles is balancing power, flight time and capability due to the weight of the technology required to power the UAVs,” said Robert Truax, Senior Member of Technical Staff at Draper. “What makes the Draper and MIT team’s approach so valuable is finding the sweet spot of a small size, weight and power for an air vehicle with limited onboard computing power to perform a complex mission completely autonomously.”
Draper and MIT’s sensor- and camera-laden UAV was tested in a number of environments, ranging from cluttered warehouses to mixed open and tree-filled outdoor terrain, at speeds of up to 10 m/s in cluttered areas and 20 m/s in open areas. The UAV’s missions combined many challenging elements, including dodging trees, entering and exiting buildings, and long traverses to find a building entry point, all while maintaining precise position estimates.
Small, lightweight flight vehicles, such as consumer-grade quadrotors, are becoming increasingly common. These vehicles’ on-board state estimators are typically reliant upon frequent and accurate updates from external systems such as the Global Positioning System (GPS) to provide state estimates required for stable flight. However, in many cases GPS signals may be unavailable or unreliable, and loss of GPS can cause these vehicles to go unstable or crash, potentially putting operators, bystanders, and property in danger. Thus reliance on GPS severely limits the robustness and operational capabilities of lightweight flight vehicles. This paper introduces the Smoothing And Mapping With Inertial State Estimation (SAMWISE) navigation system. SAMWISE is a vision-aided inertial navigation system capable of providing high-rate, low-latency state estimates to enable high-dynamic flight through obstacle-laden unmapped indoor and outdoor environments. SAMWISE offers a flexible framework for inertial navigation with nonlinear measurements, such as those produced by visual feature trackers, by utilizing an incremental smoother to efficiently optimize a set of nonlinear measurement constraints, estimating the vehicle trajectory in a sliding window in real-time with a slight processing delay. To overcome this delay and consistently produce state estimates at the high rates necessary for agile flight, we propose a novel formulation in which the smoother runs in a background thread while a low-latency inertial strapdown propagator outputs position, attitude, and velocity estimates at high-rate. We additionally propose a novel measurement buffering approach to seamlessly handle delayed measurements, measurements produced at inconsistent rates, and sensor data requiring significant processing time, such as camera imagery.
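The measurement-buffering idea described above can be sketched with a timestamp-ordered buffer: measurements are consumed by the smoother in time order even when they arrive late or at inconsistent rates, as camera features that require significant processing time do. This is a simplified, single-threaded sketch with hypothetical names, not the actual SAMWISE implementation:

```python
import heapq

class MeasurementBuffer:
    """Orders measurements by timestamp so that delayed arrivals
    (e.g. camera features that took time to process) are still
    consumed in time order by the estimator."""

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker so equal timestamps never compare payloads

    def push(self, stamp, meas):
        """Insert a measurement; arrival order does not matter."""
        heapq.heappush(self._heap, (stamp, self._seq, meas))
        self._seq += 1

    def pop_up_to(self, stamp):
        """Return all buffered measurements with timestamp <= stamp,
        sorted by timestamp, regardless of the order they arrived in."""
        out = []
        while self._heap and self._heap[0][0] <= stamp:
            t, _, m = heapq.heappop(self._heap)
            out.append((t, m))
        return out

# Usage: a camera measurement arrives after newer IMU data, but is
# still handed to the estimator in correct time order.
buf = MeasurementBuffer()
buf.push(0.10, "imu")
buf.push(0.30, "imu")
buf.push(0.05, "camera")  # arrives late, but is the oldest measurement
assert buf.pop_up_to(0.20) == [(0.05, "camera"), (0.10, "imu")]
```

In the full system this buffer would feed the background-thread smoother, while the low-latency strapdown propagator continues to integrate raw IMU data so that high-rate state estimates are never blocked waiting on slow sensor processing.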
We present experimental results from high-speed flight with a fully autonomous quadrotor using SAMWISE for closed-loop state estimation, drawn from flight demonstrations during the DARPA Fast Lightweight Autonomy (FLA) program in April and November of 2016. SAMWISE achieved less than 1% position error at flight speeds up to 5.5 m/s (12 mph) in a simulated indoor warehouse environment, using a scanning lidar, inertial measurement unit, and laser altimeter, during the first FLA milestone event in April 2016. In November 2016, SAMWISE achieved approximately 3% error at speeds up to 20 m/s (45 mph) in an open outdoor environment with large obstacles during the second FLA milestone event. The results of these flight tests demonstrate that our navigation system works robustly at high speed across multiple distinct environments.