Agricultural robotics and precision farming have garnered increasing attention, particularly with the development of autonomous robot sprayers based on visual Simultaneous Localization and Mapping (SLAM). Orchard navigation presents a formidable challenge due to GNSS signal interruptions caused by dense tree canopies, as well as the high cost and inherent complexity of 3D LiDAR-based systems. At the same time, the spraying system must determine the spray amount from canopy size to improve efficiency. To overcome these disadvantages, this study employed visual SLAM based on Real-Time Appearance-Based Mapping (RTAB-Map), using a depth camera as the sole sensor to plan the robot's navigation path within a specially designed artificial-tree orchard system. The orchard navigation path was established by building a local map through 3D mapping and estimating the robot's position from the acquired depth data. A differential-drive robot was used as the experimental platform to evaluate obstacle avoidance performance and navigation accuracy under various environmental conditions. In the navigation accuracy experiment, the average navigation error was expected to remain within 15 cm in both outdoor and indoor trials. The proposed prototype system is intended to be deployed for pesticide spraying in orchard fields on an industrial robotic platform.
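To illustrate the canopy-size-based spray dosing mentioned above, the following minimal sketch scales a spray rate with the estimated canopy volume and clamps it to nozzle limits. All function names, rates, and thresholds here are hypothetical illustrations, not values from the study:

```python
def spray_rate_ml_per_min(canopy_volume_m3,
                          base_rate=50.0,      # hypothetical minimum rate (mL/min)
                          rate_per_m3=120.0,   # hypothetical dose per m^3 of canopy
                          max_rate=600.0):     # hypothetical nozzle capacity (mL/min)
    """Scale spray output with estimated canopy volume, clamped to nozzle limits."""
    if canopy_volume_m3 <= 0.0:
        return 0.0  # no canopy detected: keep the nozzle closed
    rate = base_rate + rate_per_m3 * canopy_volume_m3
    return min(rate, max_rate)

# Example: a small canopy of 0.5 m^3 yields 50 + 120 * 0.5 = 110 mL/min,
# while a very large canopy saturates at the nozzle capacity of 600 mL/min.
```

In practice, the canopy volume would come from the depth camera's point cloud (e.g., by bounding-box or voxel estimation), and the mapping from volume to dose would be calibrated for the target crop and pesticide.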