Video image-based air refueling auxiliary cohesion method

Bibliographic Details
Main Author: ZHENG SHUNYI
Format: Patent
Language: English
Published: 27.05.2015
Summary: The invention discloses a video image-based air refueling auxiliary cohesion (docking) method comprising the following steps: A, obtaining sequence image data: the receiver aircraft continuously acquires original floating anchor (drogue) sequence images with a digital camera; B, processing each original floating anchor image as follows: B1, building a Gaussian image pyramid from the acquired image; B2, performing fast image segmentation on each pyramid level with a toboggan algorithm to obtain pattern spots; B3, evaluating indices on the obtained pattern spots to generate index charts; B4, integrating the index charts generated at the different pyramid scales into a multi-scale comprehensive index chart; B5, recognizing the floating anchor oil outlet area in the multi-scale comprehensive index chart; B6, accurately calculating the floating anchor spatial position from the recognized oil outlet area; B7, outputting the floating anchor spatial position in real time to guide the pilot in docking the fuel receiving pipe with the floating anchor. The method detects the floating anchor spatial position in real time with high reliability, so that the docking state can be judged from the position measured in real time.
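Step B1 builds a Gaussian pyramid: the image is repeatedly blurred and downsampled by two, so that later steps can be run at several scales. The patent does not give an implementation; the following is a minimal sketch in Python with numpy, where the function name and the 5-tap binomial blur kernel are assumptions, not taken from the patent:

```python
import numpy as np

def gaussian_pyramid(img, levels=3):
    """Build a Gaussian pyramid: at each level, blur with a separable
    5-tap binomial kernel, then drop every other row and column."""
    k = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0  # approximates a Gaussian
    pyr = [img.astype(float)]
    for _ in range(levels - 1):
        cur = pyr[-1]
        # separable blur: filter rows, then columns (edge-padded so the
        # output keeps the input size before downsampling)
        blurred = np.apply_along_axis(
            lambda r: np.convolve(np.pad(r, 2, mode="edge"), k, mode="valid"),
            1, cur)
        blurred = np.apply_along_axis(
            lambda c: np.convolve(np.pad(c, 2, mode="edge"), k, mode="valid"),
            0, blurred)
        pyr.append(blurred[::2, ::2])  # downsample by 2 in each direction
    return pyr
```

In practice a library routine such as OpenCV's `cv2.pyrDown` would do the same blur-and-decimate step; the explicit version above only shows the structure of the pyramid used in steps B1 and B4.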
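Step B2's toboggan segmentation can be pictured as each pixel "sliding downhill" on a gradient-magnitude surface; all pixels that reach the same local minimum form one pattern spot. A minimal sketch, assuming 8-connectivity and a precomputed gradient-magnitude image (this is an illustration of the general toboggan idea, not the patent's actual implementation):

```python
import numpy as np

def toboggan_labels(grad):
    """Toboggan segmentation: every pixel slides to its strictly lowest
    8-neighbour until it reaches a local minimum; pixels sharing a
    minimum get the same label (one pattern spot)."""
    h, w = grad.shape
    nxt = np.empty((h, w), dtype=np.int64)
    for y in range(h):
        for x in range(w):
            best = (y, x)  # stay put unless a strictly lower neighbour exists
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx_ = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx_ < w and grad[ny, nx_] < grad[best]:
                        best = (ny, nx_)
            nxt[y, x] = best[0] * w + best[1]
    # follow the slide paths to their fixed points (the local minima)
    # by pointer doubling: each pass squares the step length
    f = nxt.ravel()
    for _ in range(int(np.ceil(np.log2(f.size))) + 1):
        f = f[f]
    return f.reshape(h, w)
```

The label of each pixel is the flat index of the minimum it slides to, so spots fall out directly; a real implementation would run this per pyramid level (step B2) before the per-spot index evaluation of step B3.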
Bibliography: Application Number: CN20131590732