Human observers often behave like Bayes-optimal decision makers, and there is growing evidence that the deployment of the visual system may likewise be driven by probabilistic mental models of the environment. We tested whether eye movements during a dynamic interception task were indeed optimised according to Bayesian inference principles. Forty-one participants intercepted oncoming balls in a virtual reality racquetball task across five counterbalanced conditions in which the relative probability of the ball's onset location was manipulated. Analysis of pre-onset gaze positions showed that eye position tracked the true distribution of onset locations, indicating that the gaze system spontaneously adhered to environmental statistics. Eye position did not, however, minimise the distance between the target and foveal vision in a fully probabilistic way, but instead often reflected a ‘best guess’ about onset location. Trial-to-trial changes in gaze position were better explained by a Bayesian learning model (the Hierarchical Gaussian Filter) than by associative learning models. Additionally, parameters quantifying the precision of beliefs and prediction errors, extracted from participant-wise model fits, were related to both task-evoked pupil dilation and variability in gaze position, providing further evidence that probabilistic context is reflected in spontaneous gaze dynamics.
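The contrast drawn above between associative and Bayesian learning can be illustrated with a minimal sketch. The code below is not the paper's implementation: it compares a fixed-learning-rate Rescorla-Wagner delta rule (a standard associative model) with a precision-weighted Gaussian update, which is a single-level simplification of the Hierarchical Gaussian Filter. All function names and parameter values are illustrative assumptions.

```python
def rescorla_wagner(observations, alpha=0.1):
    """Associative learning: prediction errors are weighted by a
    fixed learning rate, regardless of how reliable the belief is."""
    belief = 0.0
    for obs in observations:
        belief += alpha * (obs - belief)  # fixed-rate delta rule
    return belief

def precision_weighted_update(observations, prior_mean=0.0,
                              prior_var=10.0, obs_var=1.0):
    """Bayesian learning (conjugate Gaussian): the learning rate is
    the relative precision of the observation, so updates shrink as
    the belief becomes more precise, as in the HGF's lowest level."""
    mean, var = prior_mean, prior_var
    for obs in observations:
        k = var / (var + obs_var)    # adaptive learning rate
        mean += k * (obs - mean)     # precision-weighted prediction error
        var = (1.0 - k) * var        # belief precision grows with evidence
    return mean, var
```

The key qualitative difference is that the Bayesian learner's effective learning rate decays as evidence accumulates, whereas the associative learner updates at a constant rate indefinitely; model comparison between schemes of this kind underlies the trial-to-trial analysis described above.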