DaG LLM ver 1.0: Pioneering Instruction-Tuned Language Modeling for Korean NLP
- Resource Type: Working Paper
- Authors: Jang, Dongjun; Lee, Sangah; Byun, Sungjoo; Kim, Jinwoong; Seo, Jean; Kim, Minseok; Kim, Soyeon; Oh, Chaeyoung; Kim, Jaeyoon; Jo, Hyemi; Shin, Hyopil
- Source:
- Subject: Computer Science - Computation and Language
- Language:
This paper presents the DaG LLM (David and Goliath Large Language Model), a language model specialized for Korean and fine-tuned via instruction tuning on 41 tasks spanning 13 distinct categories.