Syllable-Level Long Short-Term Memory Recurrent Neural Network-Based Language Model for a Korean Voice Interface in Intelligent Personal Assistants

Abstract

This study proposes a syllable-level long short-term memory (LSTM) recurrent neural network (RNN)-based language model for a Korean voice interface in intelligent personal assistants (IPAs). Most Korean voice interfaces in IPAs use word-level n-gram language models. Such models suffer from two problems: 1) the syntactic information available from a longer word history is limited by the size of n, and 2) out-of-vocabulary (OOV) words can occur with a word-based vocabulary. To solve the first problem, the proposed model uses an LSTM RNN-based language model, because an LSTM RNN captures long-term dependencies. To solve the second problem, the proposed model is trained on a syllable-level text corpus. Korean words comprise syllables, and therefore OOV words do not arise in a syllable-based lexicon. In experiments, the RNN-based language model and the proposed model achieved perplexities (PPL) of 68.74 and 17.81, respectively.
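The two ideas in the abstract can be illustrated with a short sketch. This is a hypothetical illustration, not the paper's implementation: the corpus, words, and helper functions below are invented for the example. It shows (a) that splitting Korean words into syllable characters yields a closed vocabulary with no OOV tokens, and (b) how perplexity is computed from per-token model probabilities.

```python
import math

def to_syllables(word):
    # Each Hangul character is one syllable block, so character
    # splitting is syllable splitting for Korean words.
    return list(word)

# Tiny invented corpus for illustration only.
corpus = ["안녕하세요", "감사합니다"]
syllable_vocab = sorted({s for w in corpus for s in to_syllables(w)})

# A word never seen at the word level is still fully covered as long
# as its syllables appear in the syllable vocabulary -> no OOV entry.
new_word = "안녕"
oov = [s for s in to_syllables(new_word) if s not in syllable_vocab]
assert oov == []

def perplexity(token_probs):
    # PPL = exp of the average negative log-probability the model
    # assigns to each token in the evaluation text.
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# If a model assigns probability 0.5 to every token, PPL is 2.
print(perplexity([0.5, 0.5, 0.5]))
```

Lower perplexity means the model is less "surprised" by held-out text, which is why the drop from 68.74 to 17.81 reported above indicates a better model.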


Modules


Algorithms


Software And Hardware

• Hardware:
  - Processor: Intel i3 or i5
  - RAM: 4 GB
  - Hard disk: 16 GB
• Software:
  - Operating system: Windows 2000/XP/7/8/10
  - Tools: Anaconda, Jupyter, Spyder, Flask
  - Frontend: Python
  - Backend: MySQL