Variational Auto-Encoder for text generation
- Resource Type
- Conference
- Authors
- Hu, Haojin; Liao, Mengfan; Mao, Weiming; Liu, Wei; Zhang, Chao; Jing, Yanmei
- Source
- 2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC), pp. 595-598, Jun. 2020
- Subject
- Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Engineered Materials, Dielectrics and Plasmas
Engineering Profession
Robotics and Control Systems
Transportation
Recurrent neural networks
Computational modeling
Decoding
Semantics
Machine learning
Maximum likelihood estimation
Task analysis
variational auto-encoder
recurrent neural network
text generation
- Language
- Abstract
Many different approaches to text generation have been proposed in the past. The recurrent neural network language model (RNNLM) is powerful and scalable for text generation in unsupervised generative modeling. We extend the RNNLM and propose the Variational Auto-Encoder Recurrent Neural Network Language Model (VAE-RNNLM), which is designed to explicitly capture global features of text in a continuous latent variable. Maximum likelihood learning in such a model poses an intractable inference problem. VAE-RNNLM circumvents this difficulty by adopting the architecture of recent advances in variational inference, which yields a practical training technique for powerful neural network generative models with latent variables. In this paper, we apply VAE-RNNLM to text generation and achieve good performance.
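The variational inference scheme the abstract refers to trains the model by maximizing the evidence lower bound (ELBO): a reconstruction term plus a KL penalty that keeps the approximate posterior close to a standard-normal prior, with latent samples drawn via the reparameterization trick. The following is a minimal NumPy sketch of those two ingredients under standard diagonal-Gaussian assumptions, not the authors' implementation; the function and variable names are illustrative only.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Sample z = mu + sigma * eps with eps ~ N(0, I), so the sampling step
    # stays differentiable in mu and log_var (the reparameterization trick).
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ):
    # 0.5 * sum( mu^2 + sigma^2 - log sigma^2 - 1 )
    return 0.5 * np.sum(mu**2 + np.exp(log_var) - log_var - 1.0)

def elbo(recon_log_likelihood, mu, log_var):
    # ELBO = E_q[log p(x|z)] - KL(q(z|x) || p(z)); training maximizes this
    # tractable lower bound instead of the intractable marginal likelihood.
    return recon_log_likelihood - kl_to_standard_normal(mu, log_var)

rng = np.random.default_rng(0)
mu = np.zeros(16)        # encoder mean for one sentence (hypothetical size)
log_var = np.zeros(16)   # encoder log-variance
z = reparameterize(mu, log_var, rng)  # latent code fed to the RNN decoder
```

In a full VAE-RNNLM, `mu` and `log_var` would come from an RNN encoder over the input sentence, and `recon_log_likelihood` from an RNN decoder conditioned on `z`; when the posterior matches the prior (`mu = 0`, `log_var = 0`) the KL term is exactly zero.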