Eprint already available on another site (E-prints, Working papers and Research blog)
Hybrid model for word prediction using naive bayes and latent information
Goulart, Henrique X.; Dalle Lucca Tosi, Mauro; Goncalves, Daniel et al.


Full Text
Publisher postprint (653.47 kB)

Abstract :
[en] Historically, the Natural Language Processing area has received considerable attention from researchers. One of the main motivations behind this interest is the word prediction problem: given a set of words in a sentence, recommend the next word. In the literature, this problem is addressed by methods based on syntactic or semantic analysis. On its own, neither analysis achieves practical results for end-user applications. For instance, Latent Semantic Analysis can capture semantic features of text, but cannot suggest words that respect syntactic rules [1]. On the other hand, there are models that combine both analyses and achieve state-of-the-art results, e.g. Deep Learning. However, such models can demand high computational effort, which can make them infeasible for certain types of applications. With advances in technology and mathematical models, it is possible to develop faster and more accurate systems. This work proposes a hybrid word suggestion model, based on Naive Bayes and Latent Semantic Analysis, that considers the neighboring words around unfilled gaps. Results show that this model achieves 44.2% accuracy on the MSR Sentence Completion Challenge.
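The abstract does not specify how the two components are combined, so the following is only an illustrative toy sketch of the general idea: score each candidate word for a sentence gap with a Naive Bayes model over co-occurrence counts and with cosine similarity in an LSA (truncated-SVD) space, then mix the two scores. The corpus, window size, min-max normalisation, and weighted mixing are all assumptions, not the authors' method.

```python
import math
from collections import Counter, defaultdict

import numpy as np

# Toy corpus; the actual model was evaluated on the MSR Sentence Completion data.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the mouse",
    "a dog chased a cat",
]
sents = [s.split() for s in corpus]
vocab = sorted({w for s in sents for w in s})
w2i = {w: i for i, w in enumerate(vocab)}

# --- Naive Bayes component: P(w) * prod_c P(c | w), counts from a +/-2 window ---
unigram = Counter(w for s in sents for w in s)
total_words = sum(unigram.values())
cooc = defaultdict(Counter)
WINDOW = 2
for s in sents:
    for i, w in enumerate(s):
        for j in range(max(0, i - WINDOW), min(len(s), i + WINDOW + 1)):
            if j != i:
                cooc[w][s[j]] += 1

def nb_logscore(word, context):
    score = math.log(unigram[word] / total_words)       # prior P(w)
    denom = sum(cooc[word].values()) + len(vocab)       # Laplace smoothing
    for c in context:
        score += math.log((cooc[word][c] + 1) / denom)  # likelihood P(c | w)
    return score

# --- LSA component: rank-k SVD of the word-by-sentence count matrix ---
M = np.zeros((len(vocab), len(sents)))
for j, s in enumerate(sents):
    for w in s:
        M[w2i[w], j] += 1
U, S, _ = np.linalg.svd(M, full_matrices=False)
k = 2
emb = U[:, :k] * S[:k]                                  # low-rank word vectors

def lsa_score(word, context):
    ctx = np.mean([emb[w2i[c]] for c in context], axis=0)  # average context vector
    v = emb[w2i[word]]
    denom = np.linalg.norm(v) * np.linalg.norm(ctx)
    return float(v @ ctx / denom) if denom else 0.0        # cosine similarity

def predict(context, alpha=0.5, candidates=None):
    """Fill a gap: mix min-max-normalised NB and LSA scores with weight alpha."""
    candidates = candidates or vocab
    nb = {w: nb_logscore(w, context) for w in candidates}
    ls = {w: lsa_score(w, context) for w in candidates}
    def norm(d):
        lo, hi = min(d.values()), max(d.values())
        return {w: (v - lo) / (hi - lo) if hi > lo else 0.0 for w, v in d.items()}
    nbn, lsn = norm(nb), norm(ls)
    return max(candidates, key=lambda w: alpha * nbn[w] + (1 - alpha) * lsn[w])
```

For the gap in "the ___ sat on the mat", `predict(["the", "sat", "on", "the", "mat"], candidates=["cat", "dog", "mouse"])` ranks the hypothetical candidate words by the mixed score.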
Disciplines :
Computer science
Author, co-author :
Goulart, Henrique X.
Dalle Lucca Tosi, Mauro ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Goncalves, Daniel
Maia, Rodrigo F.
Wachs-Lopes, Guilherme A.
Language :
English
Title :
Hybrid model for word prediction using naive bayes and latent information
Publication date :
Available on ORBilu :
since 06 September 2022

