Introduction

Gimbal systems installed on drones for outdoor operations are expensive, and their IMUs and brushless motors are prone to accumulating drift over long-term operation; the problem worsens when these systems are operated in harsh weather conditions. To overcome these constraints, a new computer-vision-based tracking and fusion algorithm dedicated to gimbal systems on drones is proposed. The main contributions are: (a) a lightweight network model with a ResNet-18 backbone, trained from scratch to segment images into two binary classes (ground and sky); (b) tracking of geometric primitives, the skyline and the 3D ground plane, as cues, combined with orientation estimates from the IMU, to provide multiple orientation hypotheses; (c) adaptive particle sampling on a spherical surface to fuse the orientations from the aforementioned sources efficiently. The final prototype is evaluated on a real-time embedded system and with on-ground functional tests.
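The fusion step in contribution (c) can be illustrated with a minimal sketch: particles are sampled on the unit sphere, weighted by how well they agree with each orientation hypothesis, and the weighted mean is taken as the fused estimate. This is not the paper's implementation; the function names, the fixed concentration parameter `kappa`, and the Gaussian-on-geodesic-distance weighting are all assumptions made for illustration.

```python
import math
import random

def normalize(v):
    # Scale a 3-vector to unit length.
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angular_distance(a, b):
    # Geodesic angle (radians) between two unit vectors.
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.acos(dot)

def random_unit_vector(rng):
    # Uniform sample on the unit sphere.
    z = rng.uniform(-1.0, 1.0)
    theta = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(theta), r * math.sin(theta), z)

def fuse_orientations(guesses, n_particles=500, kappa=20.0, seed=0):
    """Fuse several orientation hypotheses (unit 'up' vectors) by
    weighting particles on the sphere and returning the weighted mean.
    kappa controls how sharply disagreement is penalized (assumed value)."""
    rng = random.Random(seed)
    particles = [random_unit_vector(rng) for _ in range(n_particles)]
    weights = []
    for p in particles:
        # Higher weight when the particle is close to every hypothesis.
        w = 1.0
        for g in guesses:
            w *= math.exp(-kappa * angular_distance(p, g) ** 2)
        weights.append(w)
    total = sum(weights)
    mean = [sum(w * p[i] for w, p in zip(weights, particles)) / total
            for i in range(3)]
    return normalize(mean)
```

In an adaptive variant, the next batch of particles would be drawn around the current estimate instead of uniformly, concentrating samples where the hypotheses agree.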

Description:

The datasets used to test the proposed algorithm are presented in the following tables. Three types of tests were defined to evaluate single- and multi-axis rotation.

  1. Roll Tests
  2. Pitch Tests
  3. Mixed Tests

The tests were planned to collect data at three different angular speeds (3, 9, and 15 deg/sec); for each speed, two datasets were recorded (test-1 and test-2). The resulting data for each test is contained in a compressed rosbag, available in the "compressed rosbag" column of the tables in the Datasets section.
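The test plan above implies 3 test types × 3 angular speeds × 2 runs = 18 rosbags in total. The short sketch below enumerates that grid; the file-naming scheme shown is hypothetical, chosen only to make the counting concrete, and need not match the actual names in the tables.

```python
from itertools import product

TEST_TYPES = ["roll", "pitch", "mixed"]   # the three test types above
SPEEDS_DEG_S = [3, 9, 15]                 # angular speeds in deg/sec
RUNS = ["test-1", "test-2"]               # two recordings per speed

def dataset_names():
    # Hypothetical naming scheme: <type>_<speed>degs_<run>.bag
    return [f"{t}_{s}degs_{r}.bag"
            for t, s, r in product(TEST_TYPES, SPEEDS_DEG_S, RUNS)]

for name in dataset_names():
    print(name)
```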

Platform:

Parts.png

The topics recorded in the rosbags are: