NLP Research Based on Transformer Model
- Resource Type
- Conference
- Authors
- Wu, Junjie; Huang, Xueting; Liu, Jingnian; Huo, Yingzi; Yuan, Gaojing; Zhang, Ronglin
- Source
- 2023 IEEE 10th International Conference on Cyber Security and Cloud Computing (CSCloud) / 2023 IEEE 9th International Conference on Edge Computing and Scalable Cloud (EdgeCom), pp. 343-348, Jul. 2023
- Subject
- Communication, Networking and Broadcast Technologies; Computing and Processing; Deep learning; Cloud computing; Computational modeling; Transformers; Natural language processing; Production facilities; History; Transformer; NLP; RNN; Transformer XL
- Language
- ISSN
- 2693-8928
Natural language processing (NLP) is an important research area in artificial intelligence and occupies a pivotal position in deep learning. This paper describes in detail NLP research based on the Transformer architecture, demonstrating its high performance and promising development prospects.