Automated container terminals are complex systems with multiple interactions and highly dynamic characteristics. Integrated scheduling is expected to improve overall efficiency. However, traditional optimization approaches such as mathematical models and meta-heuristic algorithms struggle to cope with such high dynamics. This paper presents a reinforcement learning approach based on the scheduling network method. Network-based heuristic rules are introduced into the action space, and a novel state definition is proposed that integrates local and global information about the scheduling problem. Group training and group validation strategies are adopted to evaluate generalization ability. Numerical experiments show that the proposed approach converges to a high performance level and maintains good performance on unseen instances. Compared with the selected heuristic rules, the proposed method achieves 2.37% and 6.06% better results on training and test instances, respectively.
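The core idea, an agent whose actions select among heuristic dispatching rules and whose state combines local and global information, can be illustrated with a minimal sketch. The rule set, the task tuple format, and the flow-time objective below are illustrative assumptions, not the paper's actual model:

```python
# Toy sketch: the action space is a set of heuristic dispatching rules
# rather than raw task assignments. Rule names, the task format, and
# the flow-time objective are illustrative assumptions.

# Each task is (task_id, processing_time, arrival_order).
def shortest_processing_time(queue):
    return min(queue, key=lambda t: t[1])

def first_come_first_served(queue):
    return min(queue, key=lambda t: t[2])

RULES = [shortest_processing_time, first_come_first_served]

def simulate(policy, tasks):
    """Run one episode on a single resource.

    `policy(state)` returns an index into RULES; the state pairs a
    local feature (queue length) with a global one (elapsed time).
    Returns total flow time (sum of completion times); lower is better.
    """
    queue = list(tasks)
    clock = 0
    total_flow = 0
    while queue:
        state = (len(queue), clock)       # local + global information
        task = RULES[policy(state)](queue)
        queue.remove(task)
        clock += task[1]
        total_flow += clock
    return total_flow

tasks = [(0, 3, 0), (1, 1, 1), (2, 2, 2)]
print(simulate(lambda s: 0, tasks))  # always SPT  -> 10
print(simulate(lambda s: 1, tasks))  # always FCFS -> 13
```

In a full reinforcement learning setup, a learned policy would replace the fixed lambdas, choosing a rule per decision step based on the state; here the two fixed policies merely show that rule choice changes the scheduling outcome.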