Manual annotation of Korean corpora is time-consuming and labor-intensive, and as a lower-resource language Korean is difficult to integrate with existing linguistic resources. To address this, we construct an effective structural representation of Korean from the perspective of representation learning, with the goal of improving downstream natural language processing tasks. Combining deep reinforcement learning with the self-attention mechanism, we propose a hierarchical self-attention model, Hierarchically Structured Korean (HS-K). Following the Actor-Critic paradigm in reinforcement learning, the model uses text classification performance as the reward signal and reformulates the text structure division task as a sequential decision task. Experimental results show that the model identifies important text structures in Korean that are close to manual annotations, providing useful support for the informatization and intelligent processing of Korean.
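The core idea described above can be illustrated with a minimal sketch: an actor decides, token by token, whether to place a structure boundary, and a critic provides a baseline against a delayed scalar reward that stands in for downstream classification performance. All names, shapes, and the toy reward below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): each "token" has a feature
# vector; the actor decides, token by token, whether to insert a
# structure boundary (action 1) or not (action 0).
D = 8                    # feature dimension (hypothetical)
T = 10                   # tokens per sentence (hypothetical)
w_actor = np.zeros(D)    # logistic policy weights
w_critic = np.zeros(D)   # linear value-function weights
lr = 0.1

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def episode(features, reward_fn):
    """Run one sequential-decision episode and apply an Actor-Critic
    style update; reward_fn maps the chosen boundary actions to a
    scalar, standing in for classifier feedback."""
    global w_actor, w_critic
    probs = sigmoid(features @ w_actor)              # P(boundary) per token
    actions = (rng.random(T) < probs).astype(float)  # sample decisions
    reward = reward_fn(actions)                      # delayed episode reward
    values = features @ w_critic                     # critic's baseline
    advantage = reward - values                      # advantage estimate
    # Policy gradient for a Bernoulli policy, with the critic as baseline.
    grad_logp = (actions - probs)[:, None] * features
    w_actor += lr * (advantage[:, None] * grad_logp).mean(axis=0)
    # Critic regresses its value estimate toward the observed reward.
    w_critic += lr * (advantage[:, None] * features).mean(axis=0)
    return reward

# Toy stand-in for classifier feedback: rewards a boundary at token 5 only.
target = np.zeros(T)
target[5] = 1.0
def reward_fn(actions):
    return 1.0 - np.abs(actions - target).mean()

features = rng.normal(size=(T, D))
rewards = [episode(features, reward_fn) for _ in range(500)]
```

The sketch shows only the decision-and-update loop; the actual HS-K model couples this loop with a hierarchical self-attention encoder and a real text classifier supplying the reward.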