NOVOTNÝ, Vít, Michal ŠTEFÁNIK, Eniafe Festus AYETIRAN, Petr SOJKA and Radim ŘEHŮŘEK. When FastText Pays Attention: Efficient Estimation of Word Representations using Constrained Positional Weighting. Journal of Universal Computer Science. New York, USA: J.UCS Consortium, 2022, vol. 28, No 2, p. 181-201. ISSN 0948-695X. Available from: https://dx.doi.org/10.3897/jucs.69619.
Basic information
Original name When FastText Pays Attention: Efficient Estimation of Word Representations using Constrained Positional Weighting
Authors NOVOTNÝ, Vít (203 Czech Republic, guarantor, belonging to the institution), Michal ŠTEFÁNIK (703 Slovakia, belonging to the institution), Eniafe Festus AYETIRAN (566 Nigeria, belonging to the institution), Petr SOJKA (203 Czech Republic, belonging to the institution) and Radim ŘEHŮŘEK (203 Czech Republic).
Edition Journal of Universal Computer Science, New York, USA, J.UCS Consortium, 2022, ISSN 0948-695X.
Other information
Original language English
Type of outcome Article in a journal
Field of Study 10201 Computer sciences, information science, bioinformatics
Country of publisher Czech Republic
Confidentiality degree is not subject to a state or trade secret
Impact factor 1.000
RIV identification code RIV/00216224:14330/22:00124923
Organization unit Faculty of Informatics
DOI http://dx.doi.org/10.3897/jucs.69619
UT WoS 000767374300005
Keywords in English Word embeddings; fastText; attention
Tags attention, language modeling, positional embeddings, word embeddings
Tags International impact, Reviewed
Changed by RNDr. Pavel Šmerk, Ph.D., učo 3880. Changed: 27/3/2023 16:59.
Abstract
In 2018, Mikolov et al. introduced the positional language model, which has characteristics of attention-based neural machine translation models and which achieved state-of-the-art performance on the intrinsic word analogy task. However, the positional model is not fast enough for practical use, and it has never been evaluated on qualitative criteria or extrinsic tasks. We propose a constrained positional model, which adapts the sparse attention mechanism from neural machine translation to improve the speed of the positional model. We evaluate the positional and constrained positional models on three novel qualitative criteria and on language modeling. We show that the positional and constrained positional models contain interpretable information about the grammatical properties of words and outperform other shallow models on language modeling. We also show that our constrained model outperforms the positional model on language modeling and trains twice as fast.
Links
MUNI/A/1195/2021, internal MU code. Name: Applied research in the areas of search, analysis and visualization of large-scale data, natural language processing, and applied artificial intelligence
Investor: Masaryk University