Computer vision, navigation, sensor fusion, software-defined radio, and big data -- in your pocket. We'll discuss the latest innovations in low-power, high-bandwidth CPU, GPU, and hybrid processor technology, and the algorithms that make the best use of these new platforms.
Robust velocity and position estimation at high update rates is crucial for mobile robot navigation. In recent years, optical flow sensors based on computer-mouse hardware chips have been shown to perform well on micro air vehicles. However, since they require more light than is present in typical indoor and outdoor low-light conditions, their practical use is limited. We present an open-source and open-hardware design of an optical flow sensor based on a machine vision CMOS image sensor with very high light sensitivity, suitable for indoor and outdoor applications. Optical flow is estimated in real time on an ARM Cortex-M4 microcontroller at a 250 Hz update rate. Angular rate compensation with a gyroscope and distance scaling using an ultrasonic sensor are performed onboard. The system is designed for further extension and adaptation, and is demonstrated in flight on a micro air vehicle.
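The pipeline the abstract describes -- raw pixel flow, derotated with gyroscope rates, then scaled to metric velocity with the ultrasonic distance -- can be sketched as follows. This is a minimal illustration under an assumed pinhole camera model; the function name, sign conventions, and parameters are not taken from the sensor's actual firmware.

```python
def flow_to_velocity(flow_x_px, flow_y_px, gyro_x, gyro_y,
                     distance_m, focal_px, dt):
    """Convert accumulated pixel flow to a metric velocity estimate.

    flow_x_px, flow_y_px : pixel flow accumulated over dt
    gyro_x, gyro_y       : angular rates (rad/s) about the image axes
    distance_m           : distance to the ground (ultrasonic sensor)
    focal_px             : focal length in pixels (assumed pinhole model)
    dt                   : integration interval in seconds
    """
    # A pure rotation at rate w shifts the image by roughly
    # focal_px * w * dt pixels, so subtract that contribution first
    # (signs here are an assumed convention).
    flow_x_derot = flow_x_px - focal_px * gyro_y * dt
    flow_y_derot = flow_y_px + focal_px * gyro_x * dt

    # Translational flow scales inversely with distance: v = flow * Z / f.
    vx = flow_x_derot * distance_m / (focal_px * dt)
    vy = flow_y_derot * distance_m / (focal_px * dt)
    return vx, vy
```

At the stated 250 Hz update rate, dt would be 0.004 s per iteration; a flow that exactly matches the predicted rotational shift yields zero translational velocity, which is the point of the gyroscope compensation.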
The DelFly Explorer is the first flapping wing Micro Air Vehicle (MAV) that is able to fly with complete autonomy in unknown environments. Weighing just 20 grams and with a wingspan of 28cm, it is equipped with an onboard stereo vision system. The DelFly Explorer can perform an autonomous take-off, keep its height, and avoid obstacles for as long as its battery lasts (~9 minutes). All sensing and processing is performed on board, so no human or offboard computer is in the loop.
Inspired by flying animals, flapping-wing MAVs are highly manoeuvrable, able to quickly transition between multiple flight regimes (such as between hover and forward flight), and are robust to collisions. Their low weight and unobtrusive appearance, as well as their ability to fly at low speeds and operate quietly, make them more suitable for use indoors or in the presence of humans than many other aerial platforms. There are many future applications, such as the detection of ripe fruit in greenhouses, for which flapping-wing MAVs would need to fly without human intervention.
However, designing flapping-wing MAVs that are capable of autonomous flight is challenging because of their small scale and extremely limited payload capability. Therefore, previous work in this area either focused on sub-tasks (such as LED-following) or outsourced parts of the sensing and control to external cameras and/or computers.
We achieved the fully autonomous flight of the DelFly Explorer by resolving the following four main challenges:
Onboard sensing/processing: We have developed a 4-gram onboard stereo vision system (2 cameras + processor) and a 1-gram onboard autopilot with a processor, barometer, accelerometers, and gyros.
Vision algorithms: We developed a new approach to purposive vision, in which the vision algorithms use sub-sampling and extract only as many image samples as necessary for subsequent control. As a result, all vision algorithms run at frame rate. Specifically, we designed efficient vision algorithms that can cope with the absence of visual texture, as often occurs in indoor environments (the bare walls of a lecture room, for example). The algorithms also deal with the distortions introduced by the combination of the flapping motion and the rolling-shutter cameras.
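The sub-sampling idea can be illustrated with a toy sketch: rather than computing dense stereo, sample sparse pixel locations, reject texture-less samples, and stop as soon as enough disparity measurements have been collected for control. The SAD matching cost, the texture test, and all thresholds below are illustrative assumptions, not the DelFly's actual implementation.

```python
def has_texture(row_vals, col, win=2, min_range=10):
    """Cheap texture test: intensity range within a small window."""
    if col - win < 0 or col + win >= len(row_vals):
        return False
    window = row_vals[col - win : col + win + 1]
    return max(window) - min(window) >= min_range

def match_disparity(left_row, right_row, col, max_disp=16, win=2):
    """Best disparity at `col` on one scanline via SAD block matching."""
    best_d, best_cost = 0, float('inf')
    for d in range(max_disp + 1):
        if col - d - win < 0:
            break  # candidate window would leave the image
        cost = sum(abs(left_row[col + k] - right_row[col - d + k])
                   for k in range(-win, win + 1))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d, best_cost

def sparse_disparities(left, right, samples, needed=20):
    """Collect disparities at sampled (row, col) locations, in priority
    order, stopping once `needed` measurements are available."""
    found = []
    for row, col in samples:
        if not has_texture(left[row], col):
            continue  # skip texture-less samples (e.g. bare walls)
        d, _cost = match_disparity(left[row], right[row], col)
        found.append((row, col, d))
        if len(found) >= needed:
            break  # enough samples for the control task
    return found
```

Stopping early is what keeps the per-frame cost bounded: the work done scales with the number of samples the control task needs, not with the image size.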
Control algorithms: We devised a control algorithm that ensures obstacle-free flight by construction. Because the DelFly Explorer cannot hover in place, the algorithm always maintains an obstacle-free region in which the DelFly can turn around. The height control is based on the onboard barometer.
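The "obstacle-free by construction" principle can be sketched as a simple decision rule: commit to a turn whenever the free space ahead shrinks to the size of the vehicle's turning circle, so the turn always happens inside space already known to be free. The turn diameter, margins, and hysteresis below are illustrative assumptions, not the DelFly's actual parameters.

```python
TURN_DIAMETER_M = 1.25   # assumed space needed to complete a full turn
MARGIN_M = 0.25          # assumed safety margin

def avoidance_step(state, closest_obstacle_m):
    """One control tick. `state` is 'forward' or 'turn'.

    While flying forward, switch to turning as soon as the distance to
    the closest obstacle no longer fits the turning circle plus margin.
    While turning, only resume forward flight once the free space has
    grown past a larger threshold (hysteresis prevents chattering).
    """
    if state == 'forward' and closest_obstacle_m <= TURN_DIAMETER_M + MARGIN_M:
        return 'turn'
    if state == 'turn' and closest_obstacle_m > TURN_DIAMETER_M + 2 * MARGIN_M:
        return 'forward'
    return state
```

A real controller would additionally choose the turn direction from the stereo disparity map and regulate height from the barometer; this sketch only captures the invariant that a full turn always fits in known-free space.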
Payload capability: Compared to its predecessor (the 16-gram, 28cm wingspan flapping-wing DelFly II), the Explorer’s payload capability was increased enough to carry the stereo vision system and autopilot. This was achieved both by a redesign of the wings and by a reduction in the number of coils in the brushless motors.
We have performed experiments in various indoor spaces, ranging from lecture rooms to office rooms and lab spaces. Below is part of an autonomous flight in a lecture room at TU Delft. The image was made by retaining the motion regions of every 10th frame of the experimental video. The original experimental video can be seen here.
The current algorithms allow collision-free flight, but do not yet form a complete solution to autonomous exploration. We are working to extend the DelFly’s exploration capabilities so that it can pass through open doors or windows, which it currently avoids.
For more information and videos, please visit the DelFly website.
Reference
“Autonomous Flight of a 20-gram Flapping Wing MAV with a 4-gram Onboard Stereo Vision System”, by C. De Wagter, S. Tijmons, B.D.W. Remes, and G.C.H.E. de Croon, (submitted).
Team
Christophe De Wagter, Sjoerd Tijmons, Bart Remes, Guido de Croon