In recent years, various forms of energy, such as electric and thermal energy, have gradually shifted from independent planning and operation of each subsystem toward joint multi-system operation. Coordinated energy scheduling techniques for multi-energy systems (MESs) have developed continuously, improving energy efficiency and reducing the overall energy cost. However, a variety of uncertainties, such as weather conditions, the randomness of renewable energy generation, and fluctuations on both the energy supply and demand sides, challenge cooperative energy scheduling in MESs. To address these issues, this article first establishes a multi-energy system integrating various energy devices and formulates the energy management problem as a Markov Decision Process (MDP). Then, with the goal of minimizing the energy cost and the carbon emission cost, a deep reinforcement learning (DRL) algorithm, namely the twin delayed deep deterministic policy gradient (TD3), is used to coordinate the scheduling of electric and thermal energy and to effectively handle the uncertainties. Finally, simulation experiments verify the effectiveness of the TD3 algorithm against a benchmark DRL algorithm in terms of system cost and carbon emission.
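To make the MDP formulation concrete, the following minimal Python sketch illustrates how the energy management problem can be cast as a sequential decision process whose reward is the negative of the energy cost plus the carbon emission cost. The environment class, device set, prices, and emission factors below are simplified assumptions for illustration only, not the model used in this article; in practice the action would come from the trained TD3 policy rather than the random placeholder shown.

```python
# Minimal, illustrative MDP sketch for MES energy management.
# All names (MESEnv, prices, emission factors) are hypothetical assumptions.
import numpy as np


class MESEnv:
    """Toy multi-energy-system environment: state -> action -> reward."""

    def __init__(self, horizon=24, seed=0):
        self.horizon = horizon
        self.rng = np.random.default_rng(seed)
        self.t = 0

    def reset(self):
        self.t = 0
        return self._state()

    def _state(self):
        # State: time step, renewable output, electric demand, thermal demand
        # (randomized here to stand in for the uncertain quantities).
        return np.array([
            self.t / self.horizon,
            self.rng.uniform(0.0, 1.0),   # renewable generation (normalized)
            self.rng.uniform(0.2, 1.0),   # electric demand
            self.rng.uniform(0.1, 0.8),   # thermal demand
        ], dtype=np.float32)

    def step(self, action):
        # Action: grid electricity purchase and gas-boiler heat output, both in [0, 1].
        grid_power, boiler_heat = np.clip(action, 0.0, 1.0)
        elec_price, gas_price = 0.12, 0.05     # assumed energy prices
        carbon_price = 0.02                    # assumed carbon cost per kg
        grid_ef, gas_ef = 0.5, 0.2             # assumed emission factors (kg/kWh)

        energy_cost = elec_price * grid_power + gas_price * boiler_heat
        carbon_cost = carbon_price * (grid_ef * grid_power + gas_ef * boiler_heat)
        reward = -(energy_cost + carbon_cost)  # objective: minimize total cost

        self.t += 1
        done = self.t >= self.horizon
        return self._state(), reward, done


if __name__ == "__main__":
    env = MESEnv()
    state, total_reward, done = env.reset(), 0.0, False
    while not done:
        action = np.random.uniform(0.0, 1.0, size=2)  # random-policy placeholder for TD3
        state, reward, done = env.step(action)
        total_reward += reward
    print(f"episode return (negative total cost): {total_reward:.3f}")
```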