Paper
Self-attention on RNN-based text classification
3 October 2022
Dailun Li, Zeying Tian, Yining Duan
Proceedings Volume 12290, International Conference on Computer Network Security and Software Engineering (CNSSE 2022); 1229005 (2022) https://doi.org/10.1117/12.2641031
Event: International Conference on Computer Network Security and Software Engineering (CNSSE 2022), 2022, Zhuhai, China
Abstract
Many researchers claim that the self-attention mechanism improves performance regardless of circumstances. Doubting this claim, we examine models equipped with scaled dot-product self-attention under various experimental settings to develop a comprehensive understanding of the technique. We verify that the benefit of the attention mechanism grows as the dataset increases in size, and we observe that the attention layer can hurt performance on small datasets. Moreover, we demonstrate that the effect of attention is model-dependent: opposite effects may be obtained on the same dataset with different models.
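The abstract refers to the scaled dot-product self-attention mechanism. The paper's exact layer is not reproduced here; the following is a minimal NumPy sketch of the general technique, assuming (as one plausible setup for an RNN classifier) that attention is applied directly over the sequence of RNN hidden states, without separate query/key/value projections:

```python
import numpy as np

def scaled_dot_product_self_attention(H):
    """Scaled dot-product self-attention over a sequence of vectors.

    H: array of shape (seq_len, d) -- e.g. RNN hidden states, one per timestep.
    Returns an array of the same shape, where each position is a convex
    combination of all positions, weighted by scaled dot-product similarity.
    """
    d = H.shape[-1]
    # Pairwise similarity scores, scaled by sqrt(d) to keep magnitudes stable.
    scores = H @ H.T / np.sqrt(d)
    # Row-wise softmax (shifted by the row max for numerical stability).
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output vector is an attention-weighted average of the inputs.
    return weights @ H
```

In a text classifier, the attended states would typically be pooled (e.g. averaged) and fed to a final softmax layer; that pooling step is omitted here.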
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Dailun Li, Zeying Tian, and Yining Duan "Self-attention on RNN-based text classification", Proc. SPIE 12290, International Conference on Computer Network Security and Software Engineering (CNSSE 2022), 1229005 (3 October 2022); https://doi.org/10.1117/12.2641031
KEYWORDS
Data modeling, Performance modeling, Computer programming, Computer science, Computer engineering, Data hiding, Lithium
