A neural model of high-acuity vision in the presence of fixational eye movements
- Resource Type
Conference
- Authors
Anderson, Alexander G.; Olshausen, Bruno A.; Ratnam, Kavitha; Roorda, Austin
- Source
2016 50th Asilomar Conference on Signals, Systems and Computers, pp. 588-592, Nov. 2016
- Subject
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Signal Processing and Analysis
Retina
Mathematical model
Neurons
Silicon
Lattices
Bayes methods
Brain modeling
- Language
English
- Abstract
Experiments by Ratnam et al. [1] demonstrate the benefit of drift eye movements for discriminating a diffraction-limited tumbling E sized near the sampling limit of the cone photoreceptor array. Subjects discriminate the orientation of the E more accurately when its projection moves on the retina with the same motion statistics as drift eye movements, even when that motion is not correlated with the true eye motion. To better understand the neural circuitry underlying these psychophysical results, we propose a computational model based on a Bayesian ideal observer that estimates the spatial pattern on the retina from simulated retinal ganglion cell (RGC) spikes. Our Bayesian model both corroborates the psychophysical measurements and suggests a neural mechanism. We extend previous work by Burak et al. [2] with a novel online approximation to the expectation-maximization algorithm that generalizes to continuous eye movements and sparse pattern priors. From this emerges a neural model containing two populations of cells, which we hypothesize to exist in primary visual cortex: one that encodes the spatial pattern using a sparse code, and another that tracks the eye position and dynamically routes information from lateral geniculate nucleus (LGN) afferents feeding into the pattern cells.
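For intuition only, the factorized inference described in the abstract (one population estimating the pattern, another tracking eye position and routing spikes) can be caricatured in a deliberately simplified sketch. This is not the authors' implementation: it assumes a 1-D binary pattern, a small set of integer eye shifts in place of continuous drift, Poisson spiking RGCs, and a heuristic delta-rule pattern update in place of the exact M-step; every name and parameter below is invented for illustration.

```python
import numpy as np

# Toy 1-D sketch (not the paper's model): a 16-pixel binary pattern
# drifts across a 16-cone "retina" by an unknown integer shift each
# frame; RGCs emit Poisson spikes at a higher rate over bright pixels.
rng = np.random.default_rng(0)
N, K = 16, 5                          # pixels, candidate eye positions
dt, lr = 0.01, 0.05                   # frame length (s), learning rate

pattern_true = np.zeros(N)
pattern_true[[2, 3, 9, 13]] = 1.0     # a fixed, sparse test pattern

def rates(pattern, shift, r_on=5.0, r_off=0.5):
    """Poisson rates (Hz) of the RGC array for a shifted pattern."""
    return np.where(np.roll(pattern, shift) > 0.5, r_on, r_off)

pattern_est = 0.5 * np.ones(N)        # running pattern estimate

for t in range(2000):
    true_shift = t % K                # simulated drift trajectory
    spikes = rng.poisson(rates(pattern_true, true_shift) * dt)

    # "E-step": posterior over eye position for this frame's spikes,
    # computed from the current pattern estimate (uniform shift prior,
    # Poisson log-likelihood up to an additive constant).
    log_post = np.array([
        np.sum(spikes * np.log(rates(pattern_est, s) * dt)
               - rates(pattern_est, s) * dt)
        for s in range(K)
    ])
    post = np.exp(log_post - log_post.max())
    post /= post.sum()

    # "M-step": route each frame's spike residuals back to pattern
    # coordinates, weighted by the position posterior -- a delta-rule
    # surrogate for the exact M-step, illustrating dynamic routing.
    for s in range(K):
        residual = spikes - rates(pattern_est, s) * dt
        pattern_est += lr * post[s] * np.roll(residual, -s)
    pattern_est = np.clip(pattern_est, 0.0, 1.0)

print(np.round(pattern_est, 2))
```

The per-frame position posterior plays the role of the hypothesized eye-tracking population, and the posterior-weighted routing of spike residuals into pattern coordinates mirrors the proposed gating of LGN afferents into the pattern cells.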