Framework for Autonomous On-board Navigation with the AR.Drone
Peter Wei
Mr. Kemp
AE Research
2018/10/5
Article Review 2
Framework for Autonomous On-board Navigation with the AR.Drone was published by Springer Science+Business Media Dordrecht in late 2013. The authors of the paper are Jacobo Jiménez Lugo and Andreas Zell.
This article is mainly about the approach the authors took to utilize the AR.Drone, a quadrotor from a French company, and build a framework for autonomous flying. The approach does not need any external ground station for data analysis or image processing; the whole system works independently and fully on board. To obtain enough processing power, they decided to use an external processing unit that steers the vehicle to a desired location. Furthermore, they compared three different systems for autonomous flying by letting them follow several trajectories and evaluated the performance of each. This paper is connected to my research because it introduces a possible way to construct my own test setup by presenting this low-cost, reliable, and easy-to-use framework. They have also proven that the platform can be used to accomplish a variety of complex tasks such as trajectory tracking and object recognition.
The research problem is clearly defined: presenting a framework that allows the low-cost quadrotor AR.Drone to be equipped with additional processing power to perform autonomous flights, position estimation, and person following. The results, and their potential uses, show me the possibility of utilizing the AR.Drone in my own research development and idea forming. I now have an option besides building the test platform and framework entirely by myself and completely from scratch. The article essentially shows me an approach that I had not considered carefully before.
The paper separates methodology from analysis. It first discusses the hardware that is already included in the AR.Drone (both version 1.0 and 2.0):
“One board, the motherboard, contains the main processing unit (an ARM9 processor running at 468 MHz on AR.Drone 1.0 and an ARM Cortex-A8 at 1 GHz on AR.Drone 2.0), two cameras, a Wi-Fi module and a connector for software flashing and debugging. The cameras have different orientations. One is oriented vertically, pointing to the ground. It has an opening angle of 63°, a frame rate of 60 frames per second (fps) and a resolution of 320 × 240 pixels. This camera is used to estimate the horizontal speed of the vehicle. The second camera is pointing forwards. It has an opening angle of 93°, a frame rate of 15 fps and a resolution of 640 × 480 pixels on version 1.0. Version 2.0 has an improved front camera that can deliver high definition images at 720p resolution at 30 fps. Both cameras can be used for detection of markers such as stickers, caps or hulls of other AR.Drones.”
It then describes the external processor that they managed to connect to the flight controller so that the two work together.
Besides the hardware, the paper also covers the software level. The AR.Drone can provide a lot of sensor information to the client. This information is separated into two kinds of streams: a “navdata” stream containing information related to the state of the vehicle, and a video stream that delivers encoded video from the cameras. The navigation data includes the status of the vehicle, motors and communications, as well as “raw and filtered IMU measurements”, attitude (roll, pitch, yaw), height, linear velocities, and a position relative to the take-off point calculated from visual data. The navigation data also includes data from the detection of visual tags. The data is subdivided into several options that group different sensor information; the user can request the options that contain the sensor data of interest and receive them periodically. The received video can come from either of the two cameras, or as a picture-in-picture video with one camera image superposed on the upper-left corner of the other. The ARM processor runs an embedded Linux operating system that simultaneously manages the wireless communications, visual-inertial state estimation and control algorithms.
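To make the description of the navdata stream more concrete, below is a minimal Python sketch of a client-side container for the kind of information listed above. The option names, field names and bit mask are my own illustrative assumptions, not the actual AR.Drone SDK structures.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DemoOption:
    roll: float = 0.0        # attitude angles (degrees)
    pitch: float = 0.0
    yaw: float = 0.0
    altitude: float = 0.0    # height from the ultrasonic range sensor (m)
    vx: float = 0.0          # horizontal velocities estimated from the vertical camera (m/s)
    vy: float = 0.0

@dataclass
class Navdata:
    state: int = 0                                        # bit field: flying, emergency, battery, ...
    demo: DemoOption = field(default_factory=DemoOption)
    raw_imu: List[float] = field(default_factory=list)    # raw accelerometer/gyro readings

def option_mask(wanted: List[str]) -> int:
    """Build a bit mask of requested navdata options (names and bit numbers are hypothetical)."""
    catalogue = {"demo": 0, "raw_imu": 1, "vision_tags": 16}
    mask = 0
    for name in wanted:
        mask |= 1 << catalogue[name]
    return mask

# e.g. request only the attitude/velocity block and the raw IMU block
mask = option_mask(["demo", "raw_imu"])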
For navigation, the AR.Drone provides sophisticated algorithms and assistance in maneuvers such as take-off, landing and hovering in position to ensure the user's safety. To achieve stable hovering and position control, the AR.Drone estimates its horizontal velocity using its vertical camera. Two different algorithms are used to estimate the horizontal velocity. One tracks local interest points over different frames and calculates the velocity from the displacement of these points. The second algorithm estimates the horizontal speed by computing the optical flow on pyramidal images. It is the default algorithm during flight: although it is less precise, it is more robust, since it does not rely on highly textured or high-contrast scenes.
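As a rough illustration of the second (optical-flow) approach, the sketch below estimates a metric horizontal velocity from two consecutive frames of a downward-looking camera using OpenCV's pyramidal Lucas-Kanade tracker. This is only my own sketch of the general idea; it is not the algorithm that runs in the AR.Drone's firmware.

import cv2
import numpy as np

def horizontal_velocity(prev_gray, curr_gray, altitude_m, focal_px, dt_s):
    """Estimate horizontal velocity (m/s) from two consecutive grayscale frames."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return np.zeros(2)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    flow_px = np.median(new_pts[good] - pts[good], axis=0).ravel()  # median pixel shift
    # Pinhole model with the camera looking straight down:
    # ground displacement = pixel displacement * altitude / focal length
    return flow_px * altitude_m / focal_px / dt_s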
For basic controls: “The innermost loop controls the attitude of the vehicle using a PID controller to compute a desired angular rate based on the difference between the current estimate of the attitude and the attitude set point defined by the user controls. The second loop uses a proportional controller to drive the motors. When the controls are released, the AR.Drone computes a trajectory that will take it to zero velocity and zero attitude in a short time. This technique is designed off-line and uses feedforward control over the inverted dynamics of the quadrotor.”(p. 404, 2.2.2)
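The quoted control structure can be summarized in a few lines: a PID on the attitude error produces a desired angular rate, and a proportional loop on the rate error drives the motors. The gains below are placeholders I chose for illustration, not the AR.Drone's actual values.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

attitude_pid = PID(kp=4.0, ki=0.5, kd=0.1)   # illustrative gains
RATE_KP = 2.0                                # proportional gain of the inner loop

def control_step(attitude_setpoint, attitude_estimate, rate_estimate, dt):
    """One iteration of the cascaded loop; returns a motor/torque command."""
    rate_setpoint = attitude_pid.step(attitude_setpoint - attitude_estimate, dt)
    return RATE_KP * (rate_setpoint - rate_estimate)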
The paper also describes their system architecture. The system consists of two main components: an external processing unit and a proxy program that serves as a bridge between the AR.Drone and the external processing unit. They used three different devices as the external processing unit. The first is an ATMEL ATmega 1284 8-bit microcontroller. It runs at 14.7456 MHz and has 128 KB of memory, 32 I/O pins, counters and timers with PWM capabilities, as well as two USART (serial) ports and a 10-bit analog-to-digital converter. It extends the capabilities of the AR.Drone to enable autonomous flying. The second unit is a Gumstix Overo Fire board. This single-board computer carries an ARM Cortex-A8 processor running at 700 MHz with 250 MB of RAM. An additional expansion board provides access to USB ports and Ethernet connections. With the expansion board the setup measures 105 × 40 mm and weighs approximately 35 g, which is quite light. The third unit is a Hardkernel ODROID-U2, a compact development board with an ARM Cortex-A9 quad-core processor running at 1.7 GHz. Its dimensions are 48 × 52 mm and it weighs 40 g. It has 2 GB of RAM, a 64 GB flash memory card, two USB ports, a UART port and an Ethernet connection, and it runs an Ubuntu Linux system. The last two boards are used to process images on board. These systems provide the additional computational power required for high-level navigation tasks without the need of a ground station or any remote control device, thus widening the application range and eliminating delays that may be caused by communication with a base station. They tested the three systems in different scenarios. The first system autonomously flies predefined figures using the microcontroller. The second uses computer vision to find three known markers and compute the vehicle's pose. The third uses on-board vision to follow a person wearing an orange shirt.
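Since the paper's proxy program is not listed in the article, here is only a hedged sketch of what such a bridge could look like: a small UDP relay that forwards sensor packets from the drone to the external board and forwards the board's commands back. The addresses, ports and wake-up packet are placeholders, not values from the paper or the SDK.

import socket
import select

DRONE_ADDR = ("192.168.1.1", 5554)   # placeholder: drone navdata endpoint

def run_proxy(proxy_port=7777):
    drone_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    board_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    board_sock.bind(("0.0.0.0", proxy_port))    # the external unit talks to this port
    drone_sock.sendto(b"\x01", DRONE_ADDR)      # placeholder wake-up packet
    board_addr = None
    while True:
        ready, _, _ = select.select([drone_sock, board_sock], [], [])
        for sock in ready:
            data, addr = sock.recvfrom(4096)
            if sock is drone_sock and board_addr is not None:
                board_sock.sendto(data, board_addr)   # sensor data -> external unit
            elif sock is board_sock:
                board_addr = addr                     # remember where the board is
                drone_sock.sendto(data, DRONE_ADDR)   # control commands -> drone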
Test 1: Autonomous Figure Flying
To test the capabilities of their system and analyze its performance, they performed a series of autonomous flights. They used an AR.Drone (version 1.0) communicating with the microcontroller mentioned before. Flight data are logged on a base station that communicates with the microcontroller using a radio module. The flight experiments were performed in two different rooms, both equipped with a motion capture system that monitors the position of the vehicle at a frequency of 100 Hz with millimeter precision for accurate ground-truth measurements. One environment was a small room where the motion capture system covers a volume of 3 × 3 × 2 m^3. The room has a carpet that provides texture for the estimation algorithms of the AR.Drone. The second room covers a larger volume of 10 × 12 × 6 m^3. This room lacks texture on the floor, so they laid colored paper on the floor forming a figure of eight to provide texture. To test the trajectory following, the team defined waypoints within a 2 × 2 m^2 area of the small room arranged to form different patterns. After loading the parameters for the flight onto the microcontroller, it calculated the trajectory to follow. They then commanded the AR.Drone to hover at a static height (1.1 m) and then start the autonomous flight. The AR.Drone flew three figures with varying levels of difficulty: a square, a zig-zag and a spiral. In the larger room, the AR.Drone was commanded to fly a predefined figure-eight of 5 m length and 1.75 m width over the marked figure on the floor while keeping a constant height of 1.10 m.
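To picture how a predefined figure can be flown from waypoints, here is a simple sketch of a waypoint-following loop with a proportional position controller. The figure, gain and tolerance are placeholders; the paper does not list the microcontroller's actual trajectory code.

import numpy as np

WAYPOINT_TOLERANCE = 0.15   # m, switch to the next waypoint inside this radius (placeholder)
POS_GAIN = 0.8              # proportional gain on the position error (placeholder)

def square_waypoints(side=2.0):
    """Corners of a square figure inside a 2 x 2 m area (placeholder figure)."""
    return [np.array(p, dtype=float) for p in [(0, 0), (side, 0), (side, side), (0, side), (0, 0)]]

def follow_waypoints(get_position, send_velocity, waypoints):
    """Fly through the waypoints; get_position/send_velocity are user-supplied callbacks."""
    for wp in waypoints:
        while True:
            error = wp - get_position()          # 2-D position error (m)
            if np.linalg.norm(error) < WAYPOINT_TOLERANCE:
                break                            # waypoint reached, continue with the next one
            send_velocity(POS_GAIN * error)      # commanded horizontal velocity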
To evaluate the performance of their system, the team analyzed the data collected from 20 flights. They first synchronized the data logged on the base station with the data provided by the tracking system. The data from both systems is synchronized by noting the start of the experiment (movement along the x-y axes) on both data logs and the end of the AR.Drone's log, computing the total time of the experiment from these data, and interpolating the tracking system's data at the times marked by the AR.Drone's data log timestamps. The team measured the accuracy of the AR.Drone's pose estimation by comparing it with the ground-truth data. The root mean square error on the short trajectories was smaller than 5 cm, and on the longer trajectory around 10 cm, on the x and y axes, while on the yaw angle the RMSE was around 3 degrees and 10 degrees, respectively. They thought the difference in error values could be caused by drift in the sensors, which translates into a small drift of the position estimate. As expected, the height estimation from the ultrasonic range sensor yields comparable results in both experiments (around 1 cm). The errors and standard deviations on all axes showed that the AR.Drone's odometry provides enough accuracy for flying figures autonomously.
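The evaluation step they describe (resampling the motion-capture track at the drone's log timestamps and then computing per-axis errors) can be sketched as follows; the array layout is an assumption for illustration.

import numpy as np

def rmse_per_axis(drone_t, drone_xy, mocap_t, mocap_xy):
    """drone_t: (N,) timestamps; drone_xy: (N, 2) estimated x, y;
    mocap_t: (M,) timestamps; mocap_xy: (M, 2) ground-truth x, y."""
    gt = np.column_stack([
        np.interp(drone_t, mocap_t, mocap_xy[:, k]) for k in range(2)
    ])                                          # mocap resampled at the drone's timestamps
    err = drone_xy - gt
    return np.sqrt(np.mean(err ** 2, axis=0))   # one RMSE value per axis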
Test 2: Pose Estimation
The team used an AR.Drone 2.0 equipped with a Gumstix Overo Fire and a Pointgrey Firefly color camera, as mentioned before. This test consisted in letting the vehicle hover in position at a fixed distance from the pattern made with the orange markers. They fixed the pattern at a height of 110 cm and commanded the vehicle to hover at a distance of 160 cm directly behind it. The position estimate computed by the vehicle was recorded and compared with the distance between the pattern and the vehicle measured by a motion capture system with millimeter accuracy. The controllers for the x, y and z axes were evaluated by measuring the deviation from the desired hover position over the flight. The yaw angle was not evaluated since it was used to keep the pattern in the horizontal center of the image. As a result, the overall position error was lower than 15 cm for the x and y axes and lower than 5 cm for the z axis. The data chart shows that this system is capable of holding its position relative to the pattern, and that the position estimate provided by the system matches the ground-truth data provided by the tracking system.
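As a back-of-the-envelope illustration of how the distance to a marker pattern of known size can be recovered from its apparent size in the image, consider the simple pinhole-camera relations below. This is a simplification I added for clarity, not the pose-estimation pipeline the authors actually implemented.

def distance_from_pattern(pattern_width_m, pattern_width_px, focal_length_px):
    """Distance along the optical axis to a fronto-parallel pattern of known width."""
    return focal_length_px * pattern_width_m / pattern_width_px

def lateral_offset(center_offset_px, distance_m, focal_length_px):
    """Horizontal offset of the pattern centre from the image centre, in metres."""
    return center_offset_px * distance_m / focal_length_px

# e.g. a 0.30 m wide pattern that appears 120 px wide with a 600 px focal length
# is about 1.5 m away
print(distance_from_pattern(0.30, 120, 600))   # -> 1.5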
Test 3: Person Following
In this test, they used the Hardkernel ODROID-U2 board together with the AR.Drone 2.0 and a Pointgrey Firefly color camera, as described before. In this experiment a person wearing an orange shirt walks a 10 m straight line while the vehicle follows him at a distance of approximately 2 m in an outdoor setup. Once he reaches the end, he turns around and walks back to the starting point. During the whole flight, the path of the vehicle is estimated by its visual odometry. In the results, the vehicle follows the path of the line in the x direction, stays within 1 m when the person starts going back, and finishes close to the starting x position. The y direction is not controlled actively, since the vehicle lacks knowledge of this dimension; instead it controls its yaw to keep the person in the center of the picture. The vehicle drifts from the starting y position but stays within the desired following range of 2 m and finishes approximately 2 m from the starting point.
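The person-following behaviour they describe (detect the orange shirt, steer the yaw to keep the person centered, keep roughly 2 m distance) can be sketched along these lines; the colour thresholds, gains and target blob size are made-up placeholders, and the authors' actual detector and controller may differ.

import cv2

HSV_LO, HSV_HI = (5, 120, 120), (20, 255, 255)   # rough HSV range for orange (assumption)
YAW_GAIN, FWD_GAIN = 0.002, 0.00002              # placeholder proportional gains
TARGET_AREA = 15000.0                            # blob area (pixels) at roughly 2 m (placeholder)

def follow_step(bgr_frame):
    """Return (yaw_cmd, forward_cmd) for one camera frame, or None if the shirt is lost."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LO, HSV_HI)      # binary mask of orange pixels
    m = cv2.moments(mask)
    if m["m00"] < 1e-3:
        return None                              # no orange blob found
    cx = m["m10"] / m["m00"]                     # blob centroid, x coordinate (pixels)
    area = m["m00"] / 255.0                      # mask values are 0/255, so this is the pixel count
    yaw_cmd = YAW_GAIN * (bgr_frame.shape[1] / 2 - cx)   # turn to keep the person centered
    fwd_cmd = FWD_GAIN * (TARGET_AREA - area)            # move forward if the blob looks too small
    return yaw_cmd, fwd_cmd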
The conclusion of the paper clearly states that they have "presented a framework that allows the low-cost quadrotor AR.Drone to equip additional processing power to perform autonomous flights, position estimation and person following," which is exactly what they did in the paper, so I think the conclusion follows logically from the work.
I am not 100 percent sure whether the paper takes a unique approach to the problem, but I think the authors have made their own special contribution to the field. They are, to my knowledge, the first to utilize the consumer-oriented AR.Drone as a usable research and test platform with enough processing power. The paper presents three different ways to construct the framework; each offers a different amount of processing power and is designed to accomplish different tasks.
The main take-away I got from this paper is the idea of using a commercial-grade quadrotor as my test platform. There are benefits on several fronts. Firstly, it is cost-effective and lightweight. Secondly, it has a mature community where I can look for help. Lastly, because it is a commercial product, the AR.Drone is very robust and easy to work with (stable hovering, etc.), and it also has its own API.
I will probably suggest that people who are interested in my research look at this article, because if I am going to use the AR.Drone as my test platform, all the reasons behind that choice are listed here.
References
Jiménez Lugo, J., & Zell, A. (2014). Framework for Autonomous On-board Navigation with the AR.Drone. Retrieved October 10, 2018, from https://link.springer.com/article/10.1007/s10846-013-9969-5