Autonomous navigation of micro aerial vehicles using high-rate and low-cost sensors
A. Santamaria-Navarro, G. Loianno, J. Solà, V. Kumar and J. Andrade-Cetto
Autonomous Robots, 42(6): 1263-1280, 2018

The combination of visual and inertial sensors for state estimation has recently attracted wide interest in the robotics community, especially in the aerial robotics field, due to the lightweight and complementary characteristics of the sensor data. However, most state estimation systems based on visual-inertial sensing suffer from severe processor requirements, which in many cases make them impractical. In this paper, we propose a simple, low-cost and high-rate method for state estimation enabling autonomous flight of Micro Aerial Vehicles (MAVs) with a low computational burden. The proposed state estimator fuses observations from an inertial measurement unit, an optical-flow smart camera and a time-of-flight range sensor. The smart camera provides optical flow measurements at rates of up to 200 Hz, relieving the main processor of the computational bottleneck produced by image processing. To the best of our knowledge, this is the first example of extending the use of these smart cameras from hovering-like motions to odometry estimation, producing estimates that remain usable during flight times of several minutes. To validate and defend the simplest algorithmic solution, we investigate the performance of two Kalman filters, in the extended and error-state flavors, together with a large number of algorithm modifications defended in earlier literature on visual-inertial odometry, showing that their impact on filter performance is minimal. To close the control loop, a non-linear controller operating in the special Euclidean group SE(3) drives, based on the estimated vehicle state, a quadrotor platform in 3D space, guaranteeing asymptotic stability of 3D position and heading. All estimation and control tasks are solved on board and in real time on a limited computational unit.
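The fusion scheme described above can be illustrated with a toy Kalman filter. The sketch below is not the paper's estimator (which tracks the full vehicle state); it is a minimal, hypothetical example on a reduced state [z, vx, vy, vz], where gravity-compensated IMU accelerations drive the prediction, the time-of-flight sensor observes height z, and the optical-flow camera observes planar velocity (vx, vy). All noise values are illustrative assumptions.

```python
import numpy as np

class SimpleEKF:
    """Toy Kalman filter on the reduced state [z, vx, vy, vz].

    Illustrative only: IMU accelerations drive the prediction step,
    a time-of-flight range sensor observes z, and an optical-flow
    smart camera observes (vx, vy). Noise levels are made up.
    """

    def __init__(self):
        self.x = np.zeros(4)                         # state estimate
        self.P = np.eye(4)                           # state covariance
        self.Q = np.diag([1e-4, 1e-3, 1e-3, 1e-3])   # process noise (assumed)

    def predict(self, acc, dt):
        """Propagate with gravity-compensated accelerations (ax, ay, az)."""
        F = np.eye(4)
        F[0, 3] = dt                                 # z integrates vz
        self.x = F @ self.x + dt * np.array([0.0, acc[0], acc[1], acc[2]])
        self.P = F @ self.P @ F.T + self.Q

    def _update(self, H, z, R):
        """Standard linear Kalman update with measurement model H."""
        y = z - H @ self.x                           # innovation
        S = H @ self.P @ H.T + R                     # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

    def update_range(self, z_meas, r=1e-3):
        """Time-of-flight range sensor: direct observation of z."""
        H = np.array([[1.0, 0.0, 0.0, 0.0]])
        self._update(H, np.array([z_meas]), np.array([[r]]))

    def update_flow(self, vx, vy, r=1e-2):
        """Optical-flow camera: metric planar velocity (vx, vy)."""
        H = np.array([[0.0, 1.0, 0.0, 0.0],
                      [0.0, 0.0, 1.0, 0.0]])
        self._update(H, np.array([vx, vy]), r * np.eye(2))
```

Because the flow and range sensors arrive at different rates than the IMU, each measurement simply triggers its own update whenever it is available, while `predict` runs at the IMU rate; this asynchronous structure is what keeps such filters cheap enough for a small on-board processor.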
The proposed approach is validated through simulations and experimental results, which include comparisons with ground-truth data provided by a motion capture system. For the benefit of the community, we make the source code public.