Study of the efficiency of detecting and recognizing drone images from a video stream
DOI: https://doi.org/10.30837/rt.2020.3.202.14

Keywords: research, efficiency, detection, recognition, image, drone, video stream

Abstract
The authors have developed and experimentally tested an algorithm for processing the video stream of a stationary video camera. It consists of two stages: detecting moving objects and classifying these objects with a neural network. Moving objects are detected by separating them from the stationary background and by analyzing the motion history. Based on the experimental data, the effectiveness of the MOG, MOG2, KNN, GMG, CNT, GSOC, and LSBP background models for this task was analyzed, and recommendations for choosing their parameters were formulated; the selection criteria were high performance and low noise. Fully connected and convolutional neural network models were created and trained to classify 12 types of moving objects. Image sets of drones, fragments of tree foliage, grass, clouds, and insects were compiled to train the networks. Based on the training and testing results, recommendations are given on the number of network layers, the number of neurons per layer, and the number of convolutions required to achieve maximum performance and recognition accuracy. A comparative analysis of drone classification accuracy with fully connected and convolutional networks on the experimental data demonstrated the effectiveness of convolutional networks. The dependence of drone detection accuracy on image size and, accordingly, on the distance to the drone is plotted.
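
To illustrate the detection stage, a minimal Python/OpenCV sketch is given below. It assumes the opencv-contrib build (for the bgsegm module), a hypothetical input file "drone.avi", and illustrative parameter and threshold values; it is not the authors' implementation, only a sketch of the background-subtraction approach named in the abstract.

import cv2

def make_subtractor(name="MOG2"):
    # Create one of the background models compared in the paper.
    # Parameter values here are OpenCV defaults / illustrative choices.
    factories = {
        "MOG":  lambda: cv2.bgsegm.createBackgroundSubtractorMOG(),
        "MOG2": lambda: cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16),
        "KNN":  lambda: cv2.createBackgroundSubtractorKNN(),
        "GMG":  lambda: cv2.bgsegm.createBackgroundSubtractorGMG(),
        "CNT":  lambda: cv2.bgsegm.createBackgroundSubtractorCNT(),
        "GSOC": lambda: cv2.bgsegm.createBackgroundSubtractorGSOC(),
        "LSBP": lambda: cv2.bgsegm.createBackgroundSubtractorLSBP(),
    }
    return factories[name]()

subtractor = make_subtractor("MOG2")
cap = cv2.VideoCapture("drone.avi")          # hypothetical input video

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)        # foreground (moving) pixels
    # Suppress isolated noise pixels before extracting candidate regions.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 25:                       # discard tiny detections (noise); illustrative threshold
            continue
        crop = frame[y:y + h, x:x + w]
        # crop would be passed to the classification stage here.

cap.release()

Each extracted candidate region would then be fed to the second stage (a fully connected or convolutional network) for assignment to one of the 12 object classes, such as drone, foliage, cloud, or insect.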