Most human-aware navigation methods gather human features such as speed and gesture to establish a comfortable space around pedestrians. However, they do not consider the direction of human gaze, so robots often intrude into a person's gaze space, causing strong discomfort. This paper proposes a navigation framework based on an asymmetric Gaussian model and pedestrian trajectory prediction. First, the robot's RGB-D camera and laser sensors detect each pedestrian's speed, gaze direction, and other personal information, and multiple Gaussian functions are then combined to build an asymmetric human comfort space. Next, the detected pedestrian speed is used to predict pedestrian trajectories, and a dynamic cost map is obtained through a layered cost map mechanism so that temporal information can be considered while planning the robot's trajectory. Finally, an A* algorithm based on lattice state search plans a trajectory that jointly respects human comfort and the robot's dynamic constraints.
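The abstract does not give the comfort-space equations, but the asymmetric Gaussian model it names is commonly formulated with a larger variance along the person's heading/gaze direction than behind or beside them. The sketch below illustrates that idea; the function name and all sigma values are illustrative assumptions, not parameters from the paper.

```python
import math

def asymmetric_gaussian_cost(px, py, hx, hy, theta,
                             sigma_front=2.0, sigma_rear=0.5, sigma_side=1.0):
    """Comfort-space cost at point (px, py) around a human at (hx, hy)
    facing direction theta (radians). The variance along the heading is
    larger in front than behind, so the protected zone extends ahead of
    the person, toward their gaze. Sigma values are illustrative only."""
    dx, dy = px - hx, py - hy
    # Angle of the query point relative to the human's heading.
    alpha = math.atan2(dy, dx) - theta
    alpha = math.atan2(math.sin(alpha), math.cos(alpha))  # wrap to [-pi, pi]
    # Front half-plane uses the larger variance, rear half-plane the smaller.
    sigma = sigma_front if abs(alpha) <= math.pi / 2 else sigma_rear
    # Rotated 2-D Gaussian coefficients (standard asymmetric-Gaussian form).
    a = math.cos(theta)**2 / (2*sigma**2) + math.sin(theta)**2 / (2*sigma_side**2)
    b = math.sin(2*theta) / (4*sigma**2) - math.sin(2*theta) / (4*sigma_side**2)
    c = math.sin(theta)**2 / (2*sigma**2) + math.cos(theta)**2 / (2*sigma_side**2)
    return math.exp(-(a*dx*dx + 2*b*dx*dy + c*dy*dy))
```

A planner would add this cost, evaluated at each grid cell of the dynamic cost map (shifted along the predicted trajectory for future layers), so that paths crossing the space in front of a person, where their gaze falls, become more expensive than paths passing behind.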