# Autonomous drone hunter operating by deep learning and all-onboard computations in GPS-denied environments

Philippe Martin Wyder<sup>1*</sup>, Yan-Song Chen<sup>2</sup>, Adrian J. Lasrado<sup>1</sup>, Rafael J. Pelles<sup>1</sup>, Robert Kwiatkowski<sup>2</sup>, Edith O. A. Comas<sup>2</sup>, Richard Kennedy<sup>2</sup>, Arjun Mangla<sup>2</sup>, Zixi Huang<sup>3</sup>, Xiaotian Hu<sup>3</sup>, Zhiyao Xiong<sup>1</sup>, Tomer Aharoni<sup>2</sup>, Tzu-Chan Chuang<sup2</sup>, Hod Lipson<sup>1</sup>

<sup>1</sup>Department of Mechanical Engineering, Columbia University, New York, New York, USA
<sup>2</sup>Department of Computer Science, Columbia University, New York, New York, USA
<sup>3</sup>Department of Electrical Engineering, Columbia University, New York, New York, USA

<sup>*</sup>Corresponding author. E-mail: Philippe.Wyder@columbia.edu (PW)

This work was supported in part by the following grants:

- U.S. National Science Foundation National Robotics Initiative grant number 1527232 (M. A. Gore, R. J. Nelson, and H. Lipson).

## Paper Abstract:

This paper proposes a UAV platform that autonomously detects, hunts, and takes down other small UAVs in GPS-denied environments. The platform detects, tracks, and follows another drone within its sensor range using a pre-trained machine learning model. We collect and generate a 58,647-image dataset and use it to train a Tiny YOLO detection algorithm. This algorithm, combined with a simple visual-servoing approach, was validated on a physical platform. Our platform was able to successfully track and follow a target drone at an estimated speed of 1.5 m/s. Performance was limited by the detection algorithm's 77% accuracy in cluttered environments, its frame rate of eight frames per second, and the field of view of the camera.

## Data

This repository provides all data necessary to reproduce the results reported in the paper, as well as the flight footage used to evaluate our platform's performance.
Additionally, we added two drone-crash videos for educational purposes and entertainment.

### ROS Environment Drone Hunter

The ROS environment used on the hunter platform is available for download here: [ProjectVenomROS-master.zip](https://osf.io/fs4hj/)

### Detection Algorithm Training Scripts

This repository contains the resources to train Tiny-YOLO as we did for our drone hunter: [ProjectVenom_TrackingYOLO-master.zip](https://osf.io/b5zda/)

### Dataset

The drone tracking dataset used to train the drone detection algorithm was split into three files so it could be uploaded to OSF. Ensure that you download all three files into the same directory before you unzip them. Then unzip the first one, ending in ".001"; this will create a single folder containing the dataset.

[train_data_0402.7z.001](https://osf.io/hzstm/)
[train_data_0402.7z.002](https://osf.io/c7qag/)
[train_data_0402.7z.003](https://osf.io/g4dhx/)

### Flight Footage

Each hunting video consists of two files. The file ending in "FPV.mp4" shows the output of the ROS script during the flight, as well as the video downstream recorded by the on-board camera. The second file shows the drones in flight from the perspective of an observer.

#### Hunt #1

[Hunt-1-FPV.mp4](https://osf.io/xcf62/)
[Hunt-1.mp4](https://osf.io/jqmk2/)

#### Hunt #2

[Hunt-2-FPV.mp4](https://osf.io/jkxwm/)
[Hunt-2.mp4](https://osf.io/q96sz/)

## Crashes

### Reversed Propeller Directions

This educational video clarified the effects of mixing up propeller directions for the following years of research. [TheEffectOfReversePropellerDirection_20161027_035426_001.mp4](https://osf.io/478hk/)

### VIO error buildup due to lack of features

The only test site available during our research was the squash court seen in our footage. The squash court was available to us for two hours at a time, within which we had to set up, fly, and pack up our experiments.
One of the challenges we encountered was that the largely featureless space didn't allow the visual-inertial odometry algorithm to find enough features to track. This led to error buildup in the control system. The result of this error buildup can be observed in this video: [GOPR9706.mp4](https://osf.io/bs26a/)