This paper presents an intuitive interface for a human-manipulator teleoperation system that enhances operating intuitiveness and immersion through mixed reality (MR). By acquiring the pose of the operator's hand via inertial measurement units (IMUs), the proposed control method enables the operator to command the end-effector pose of the manipulator directly and intuitively through hand posture. A radial basis function neural network (RBFNN) is employed to handle dynamics uncertainties, and a null-space projection method is implemented so that the redundant manipulator can perform self-motion control. Additionally, MR-integrated visual feedback delivered through a head-mounted display provides an immersive telemanipulation experience. The stability of the control structure is proved using the Lyapunov stability theorem, and its effectiveness is validated by experiments on a 7-DoF manipulator.
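As a point of reference for the redundancy-resolution idea mentioned above, the following is a minimal sketch (not the paper's implementation) of the standard null-space projection scheme for a redundant manipulator: the primary joint-velocity command tracks a desired end-effector twist via the Jacobian pseudoinverse, while a secondary joint velocity for self-motion is projected into the Jacobian's null space so it does not disturb the end-effector. The function and variable names here are illustrative assumptions.

```python
import numpy as np

def redundancy_resolution(J, x_dot_des, q0_dot):
    """Null-space projection: q_dot = J+ * x_dot_des + (I - J+ J) * q0_dot.

    J         -- task Jacobian (m x n, n > m for a redundant arm)
    x_dot_des -- desired end-effector twist (m,)
    q0_dot    -- secondary joint velocity for self-motion (n,)
    """
    J_pinv = np.linalg.pinv(J)            # Moore-Penrose pseudoinverse
    n = J.shape[1]
    N = np.eye(n) - J_pinv @ J            # null-space projector of J
    return J_pinv @ x_dot_des + N @ q0_dot

# Example: 6-D task space, 7 joints (matching the 7-DoF experiments).
rng = np.random.default_rng(0)
J = rng.standard_normal((6, 7))
x_dot = rng.standard_normal(6)
q0_dot = rng.standard_normal(7)
q_dot = redundancy_resolution(J, x_dot, q0_dot)

# The self-motion term is filtered out by the projector, so the
# commanded joint velocity still reproduces the task-space twist.
print(np.allclose(J @ q_dot, x_dot))
```

Because `(I - J_pinv @ J)` spans the null space of `J`, any choice of `q0_dot` (e.g. for joint-limit avoidance or posture optimization) changes the arm's internal configuration without affecting the end-effector motion.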