Efficient Classification for Neural Machines Interpretations based on Mathematical models

Bibliographic Details
Published in: 2021 7th International Conference on Advanced Computing and Communication Systems (ICACCS), Vol. 1, pp. 2015-2020
Main Authors: Sharma, D.K.; Singh, Bhopendra; Regin, R.; Steffi, R.; Chakravarthi, M. Kalyan
Format: Conference Proceeding
Language: English
Published: IEEE, 19.03.2021

Summary: Developing successful embedded vision applications necessitates a detailed review of various algorithmic optimization trade-offs and a wide variety of hardware design options. This makes it difficult for developers to navigate the solution space and find design points with the best performance trade-offs. In neural machine translation, large Transformer frameworks have produced state-of-the-art results and have become the industry standard. In this paper, we clarify the mathematical mechanisms behind efficient interpretation for neural machine translation and search for the best combination of known techniques to improve interpretation speed without compromising recognition accuracy. We perform an empirical study comparing various approaches and show that combining the replacement of decoder self-attention with simplified recurrent units, a deep-encoder/shallow-decoder architecture, and multi-head attention reseeding can achieve higher accuracy. By replacing heavy functions with lighter ones and enhancing the autoencoder's layer structure, excellent results can be achieved through a harmonious mix of time-series, network-architecture, and probabilistic solutions.
ISBN: 9781665405201, 1665405201
ISSN: 2575-7288
DOI: 10.1109/ICACCS51430.2021.9441718
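
For illustration, the following is a minimal PyTorch sketch of two of the techniques named in the summary: a deep encoder paired with a shallow decoder, and decoder self-attention replaced by a simple recurrent unit (here an ordinary GRU stands in for the simplified recurrent units; the paper's actual units may differ). All module names, layer counts, and dimensions below are illustrative assumptions, not the authors' implementation.

# Sketch of deep-encoder/shallow-decoder with a recurrent replacement
# for decoder self-attention. Hyperparameters are assumptions.
import torch
import torch.nn as nn

class RecurrentDecoderLayer(nn.Module):
    """Decoder layer whose self-attention is swapped for a GRU.

    A GRU processes the target sequence strictly left to right, so it is
    causal by construction and needs no attention mask.
    """
    def __init__(self, d_model: int, nhead: int):
        super().__init__()
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.ReLU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)

    def forward(self, tgt: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        h, _ = self.rnn(tgt)                          # recurrent stand-in for self-attention
        tgt = self.norm1(tgt + h)
        a, _ = self.cross_attn(tgt, memory, memory)   # attend to encoder output
        tgt = self.norm2(tgt + a)
        return self.norm3(tgt + self.ffn(tgt))

class DeepEncoderShallowDecoder(nn.Module):
    """Deep encoder stack paired with a shallow decoder stack."""
    def __init__(self, d_model=512, nhead=8, enc_layers=12, dec_layers=1):
        super().__init__()
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, enc_layers)
        self.decoder = nn.ModuleList(
            [RecurrentDecoderLayer(d_model, nhead) for _ in range(dec_layers)]
        )

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        memory = self.encoder(src)   # one-shot encoding pass
        for layer in self.decoder:   # cheap per-step decoding stack
            tgt = layer(tgt, memory)
        return tgt

# Usage: batch of 2, source length 16, target length 10, d_model 512.
model = DeepEncoderShallowDecoder()
out = model(torch.randn(2, 16, 512), torch.randn(2, 10, 512))
print(out.shape)  # torch.Size([2, 10, 512])

The design intuition behind this split is that autoregressive decoding runs once per output token while encoding runs once per sentence, so a shallow decoder cuts the dominant per-step inference cost while the deep encoder preserves model capacity.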