Text CAPTCHA Traversal via Knowledge Distillation of Convolutional Neural Networks: Exploring the Impact of Color Channels Selection


Bibliographic Details
Published in: Recent Trends in Analysis of Images, Social Networks and Texts, Vol. 1573, pp. 111-122
Main Authors: Terekhov, Valery; Chernenky, Valery; Ishkov, Denis
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2022
Series: Communications in Computer and Information Science

Summary: Whereas most existing work has investigated the recognition of fixed-length CAPTCHAs, the authors propose applying knowledge distillation to approximate the predictions of recurrent convolutional neural networks, models that have proven effective at handling a variable number of characters in an image. The paper studies how the size and complexity of the training sample affect recognition quality. The authors also examine the effect of individual color channels, and of their linear combination, on final model quality; the importance of each channel is estimated via trainable scalar coefficients in the linear combination. The results make it possible to reduce the size of the input data without loss of recognition quality, as well as to speed up model training. An analysis of model errors leads to suggestions for improving countermeasures against automatic recognition.
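The channel-weighting idea described in the summary can be illustrated with a minimal sketch. This is not the authors' code; it assumes the common setup in which a 3-channel RGB image is collapsed to a single channel via a weighted sum, with the weight vector `w` treated as trainable parameters whose learned magnitudes serve as per-channel importance scores.

```python
import numpy as np

def mix_channels(image, w):
    """Collapse an (H, W, 3) image to (H, W) as sum_c w[c] * image[..., c].

    In training, w would be learnable scalar coefficients; their fitted
    values estimate the importance of each color channel.
    """
    return np.tensordot(image, w, axes=([-1], [0]))

# Toy usage: initialize from the standard ITU-R BT.601 grayscale weights.
rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))
w = np.array([0.299, 0.587, 0.114])
gray = mix_channels(img, w)
print(gray.shape)  # (4, 4)
```

In a neural network this is equivalent to a 1x1 convolution with one output channel, so the reduction adds almost no parameters while shrinking the input fed to the recognizer threefold.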
ISBN: 9783031151675; 3031151674
ISSN: 1865-0929; 1865-0937
DOI: 10.1007/978-3-031-15168-2_10