Context-Based Narrative Generation Transformer (NGen-Transformer)
- Resource Type
- Conference
- Authors
- Raza Samar, Abraar; Khan, Bostan; Mumtaz, Adeel
- Source
- 2022 19th International Bhurban Conference on Applied Sciences and Technology (IBCAST), pp. 256-261, Aug. 2022
- Subject
- Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Fields, Waves and Electromagnetics
Robotics and Control Systems
Signal Processing and Analysis
Measurement
Transformers
Natural language processing
Task analysis
Context modeling
Story Generation
Narrative Generation
Deep Learning
Transformer
- Language
- English
- ISSN
- 2151-1411
- Abstract
- Text generation is an important domain of natural language processing, where the plausibility of the generated text depends on the context-assimilation capabilities of the architecture used. Recently, the performance of automatic text generation has greatly improved with the use of attention-based language models. In this paper, we explore the task of story generation conditioned on a user-defined context or prompt. We propose a GPT-2 based narrative generation architecture called NGen-Transformer. Our proposed architecture focuses specifically on the context provided by the user to produce meaningful stories. To evaluate our proposed model, we use the WritingPrompts dataset, which consists of a large number of human-written sample stories based on corresponding titles or sentences (prompts). Experimental results show that our proposed NGen-Transformer model outperforms several sequence-to-sequence as well as attention-based architectures on the task of story generation.