In teleoperated robots, such as surgical robots, there is a desire to infer the intent of the operator and provide assistance as needed. This goal is especially challenging for long-horizon inference. In this paper, we propose a Transformer-based model to predict the long-horizon trajectory of the master-side manipulators of the da Vinci surgical robot, and we investigate trajectory-based haptic guidance cues as a potential form of assistance. Using the JIGSAWS dataset, our model achieved a Cartesian RMSE of 26.14 mm for 1-second-ahead trajectory prediction of the master-side manipulators when using the provided gesture labels, and 32.13 mm without gesture labels. We then created resistive and assistive haptic guidance cues using a virtual spring between the current manipulator position and prior or future predicted positions, respectively. Each condition consisted of two levels, defined by 0.5 s and 1 s time horizons. We conducted a preliminary human subject study with 10 subjects to investigate the effect of these guidance forces on completion time for a running suturing task. While we found no statistically significant differences in completion time across haptic cue types and time horizons, we observed that the long-horizon resistive guidance showed a weakly significant improvement in mean task performance in a washout trial that immediately followed the haptic condition. We also observed a large decrease in user difficulty ratings for this trial. These results indicate that haptic guidance cues could be leveraged in surgical training, potentially producing lasting after-effects on performance once the guidance has been removed.