We develop a versatile deep neural network architecture, called Lyapunov-Net, to approximate Lyapunov functions of dynamical systems in high dimensions. Lyapunov-Net guarantees positive definiteness by construction, and hence it can be easily trained to satisfy the negative orbital derivative condition, which in practice reduces the empirical risk function to a single term. This significantly simplifies hyperparameter tuning and yields greatly improved convergence during network training, as well as higher approximation quality. We also provide comprehensive theoretical justifications for the approximation accuracy and certification guarantees of Lyapunov-Net. We demonstrate the efficiency of the proposed method on nonlinear dynamical systems with high-dimensional state spaces, and show that the proposed approach significantly outperforms state-of-the-art methods.
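To make the idea concrete, the following is a minimal PyTorch sketch of one plausible realization, not necessarily the paper's exact construction. It assumes positive definiteness is obtained by composing a generic network g with the form V(x) = |g(x) - g(x*)| + delta * ||x - x*||^2 (the names `g`, `delta`, `x_star`, the network width, and the hinge-style margin below are illustrative assumptions), so that the only term left to train is the one penalizing violations of the negative orbital derivative condition.

```python
import torch
import torch.nn as nn

class LyapunovNet(nn.Module):
    """Sketch of a Lyapunov candidate that is positive definite by design:
    V(x) = |g(x) - g(x*)| + delta * ||x - x*||^2, so V(x*) = 0 and
    V(x) > 0 for x != x*. The architecture of g is an illustrative choice."""

    def __init__(self, dim, width=64, delta=1e-2):
        super().__init__()
        self.g = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )
        self.delta = delta
        # Equilibrium assumed at the origin for this sketch.
        self.register_buffer("x_star", torch.zeros(dim))

    def forward(self, x):
        # |g(x) - g(x*)| vanishes exactly at x*; the delta-term
        # enforces strict positivity everywhere else.
        gap = (self.g(x) - self.g(self.x_star)).abs().squeeze(-1)
        return gap + self.delta * (x - self.x_star).pow(2).sum(-1)

def orbital_derivative_risk(V, f, x, margin=0.1):
    """Single-term empirical risk: penalize sample points where the
    orbital derivative grad V(x) . f(x) is not sufficiently negative.
    The constant margin is a simplifying assumption."""
    x = x.clone().requires_grad_(True)
    v = V(x)
    grad_v, = torch.autograd.grad(v.sum(), x, create_graph=True)
    dv = (grad_v * f(x)).sum(-1)  # orbital derivative along the dynamics f
    return torch.relu(dv + margin).pow(2).mean()
```

In this setup one would sample collocation points x from the region of interest and minimize the single risk term with a standard optimizer; since positive definiteness needs no penalty of its own, there are no competing loss weights to balance.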