Issue 1 (215), article 2

DOI: https://doi.org/10.15407/kvt215.01.020

Cybernetics and Computer Engineering, 2024,1(215)

Dzhebrailov R.Yu., PhD Student,
Junior Researcher of the Research Laboratory of Unmanned Complexes and Systems,
https://orcid.org/0000-0002-4473-9670,
e-mail: rombik1197@gmail.com

Gospodarchuk O.Yu.,
Senior Researcher of the Intelligent Control Department,
https://orcid.org/0000-0001-6619-2277,
e-mail: olexago@gmail.com

International Research and Training Center
for Information Technologies and Systems
of the National Academy of Sciences of Ukraine
and the Ministry of Education and Science of Ukraine.
40, Acad. Glushkov av., 03187, Kyiv, Ukraine

DETECTION OF SPECIAL ZONES AS A BASIS FOR THE METHOD OF TOPOGRAPHIC AFFINITY OF IMAGES

Introduction. The satellite and inertial navigation systems of an unmanned aerial vehicle (UAV) or unmanned aircraft system (UAS) have their drawbacks. One way to eliminate these shortcomings is to develop an autonomous navigation system. However, the officially patented model of an autonomous navigation system, as it turned out, also has its drawbacks. Accordingly, there is a need to improve such an autonomous navigation system.

The purpose of the paper is to develop and study a method for determining the topographic affinity of images, based on special zones detected in images of the natural landscape, for autonomous UAV navigation.

Results. A method of topographic affinity of visual images has been developed, based on the detection of special zones by searching for local maxima of the Laplace operator in the image. The method of topographic affinity of images allows a smaller number of special points to be involved in comparison, which reduces the amount of required memory resources and increases performance.
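The core operation named above — detecting special zones as local maxima of the Laplace operator — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the function name `detect_special_zones`, the Laplacian-of-Gaussian filter scale `sigma`, and the response `threshold` are all assumptions made for the example.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, maximum_filter

def detect_special_zones(image, sigma=4.0, threshold=0.01):
    """Return (row, col) coordinates of blob-like special zones,
    found as local maxima of a Laplacian-of-Gaussian response."""
    # The negative LoG response has positive peaks at bright blobs.
    response = -gaussian_laplace(image.astype(float), sigma=sigma)
    # A pixel is a local maximum if it equals the maximum of its
    # 3x3 neighbourhood; weak responses are rejected by the threshold.
    local_max = response == maximum_filter(response, size=3)
    peaks = local_max & (response > threshold)
    return np.argwhere(peaks)

# Synthetic "landscape": a dark background with one bright blob.
img = np.zeros((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
img += np.exp(-((yy - 20) ** 2 + (xx - 40) ** 2) / (2 * 4.0 ** 2))

zones = detect_special_zones(img, sigma=4.0, threshold=0.01)
```

On this synthetic input the detector returns a zone near the blob centre (20, 40); on real landscape imagery the scale and threshold would have to be tuned to the terrain texture.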

Conclusions. The proposed method of topographic affinity of images, based on the detection of special zones (blob detection) by searching for local maxima of the Laplace operator, can be used to build an autonomous navigation system for UAVs. The algorithmic implementation of the method has shown that it can handle a large number of complex and diverse images of the earth's surface obtained during UAV flights, that it is effective in increasing the processing speed of the studied images, and that it can be used to create full-fledged UAV autonomous navigation systems.

Keywords: unmanned aerial vehicle, unmanned aerial vehicle complex, autonomous navigation, special points, special zones, method of special image zones analysis.


REFERENCES

1 Irani G., Christ J. Image processing for Tomahawk scene matching. Johns Hopkins APL Technical Digest. 1994, No. 3, pp. 250-264.

2 Durrant-Whyte H., Bailey T. Simultaneous localization and mapping: part I. IEEE Robotics and Automation Magazine. 2006, No. 2, pp. 99-110.
https://doi.org/10.1109/MRA.2006.1638022

3 Durrant-Whyte H., Bailey T. Simultaneous localization and mapping (SLAM): part II. IEEE Robotics and Automation Magazine. 2006, No. 3, pp. 108-117.
https://doi.org/10.1109/MRA.2006.1678144

4 SVO: Semi-Direct Visual Odometry for Monocular and Multi-Camera Systems / C. Forster et al. IEEE Transactions on Robotics. 2017, Vol. 33, no. 2, pp. 249-265.
https://doi.org/10.1109/TRO.2016.2623335

5 Scaramuzza D., Fraundorfer F. Visual odometry [tutorial]. Part I: The first 30 years and fundamentals. IEEE Robotics and Automation Magazine. 2011, No. 4, pp. 80-92.
https://doi.org/10.1109/MRA.2011.943233

6 Scaramuzza D., Fraundorfer F. Visual odometry: part II: matching, robustness, optimization, and applications. IEEE Robotics and Automation Magazine. 2012, No. 2, pp. 78-90.
https://doi.org/10.1109/MRA.2012.2182810

7 Efficient on-board Stereo SLAM through constrained-covisibility strategies / G. Castro et al. Robotics and Autonomous Systems. 2019, No. 116, pp. 192-205.
https://doi.org/10.1016/j.robot.2019.03.015

8 Qin T., Li P., Shen S. VINS-Mono: a robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics. 2018, No. 4, pp. 1004-1020.
https://doi.org/10.1109/TRO.2018.2853729

9 VIMO: simultaneous visual inertial model-based odometry and force estimation / B. Nisar et al. IEEE Robotics and Automation Letters. 2019, No. 3, pp. 2785-2792.
https://doi.org/10.1109/LRA.2019.2918689

10 Autonomous navigation system for unmanned aerial vehicle based on topographic clustering of visual images: pat. UA 121833 C2 Ukraine: 2020.01. No. a 2019 05904; filed 29.05.2019; published 27.07.2020, Bulletin No. 14. 18 p.

11 Intelligent information technology of autonomous navigation of an unmanned aerial vehicle / O. Volkov et al. Actual problems of automation and instrumentation: materials of the 2nd International scientific and technical conference, Kharkiv, December 6, 2018. 2018, pp. 18-19. [In Ukrainian]

12 Methodological recommendations "Combating unmanned aerial vehicles" (based on the experience of the Joint Forces Operation (formerly ATO)). Kyiv: Center for Operational Standards and Methods of Training of the Armed Forces of Ukraine in cooperation with the Main Directorate of Training of the Armed Forces of Ukraine, 2019. 44 p. [In Ukrainian]

13 Gerasimov S., Kolomiytsev O., Pustovarov V. Features of determining the measurement accuracy of inertial coordinate determination devices. Control, navigation and communication systems. 2018, No. 6(52), pp. 3-8. [In Ukrainian]

14 Zarubin A., Astin V. Analysis of total instrumental errors of inertial navigation systems. Systems of weapons and military equipment. 2006, No. 1, pp. 68-71. [In Ukrainian]

15 Han K. T. M., Uyyanonvara B. A survey of blob detection algorithms for biomedical images. 2016 7th International Conference of Information and Communication Technology for Embedded Systems (IC-ICTES), 2016, pp. 57-60.
https://doi.org/10.1109/ICTEmSys.2016.7467122

16 Zavadil J., Tuma J., Santos V. Traffic signs detection using blob analysis and pattern recognition. Proceedings of the 13th International Carpathian Control Conference (ICCC), High Tatras, 28-31 May 2012. 2012, pp. 776-779.
https://doi.org/10.1109/CarpathianCC.2012.6228752

Received 19.12.2023