Binocular Depth Perception

Title: Reconciling Pictorial Depth with Stereopsis

Author: Dhanraj Vishwanath

Affiliation: University of St Andrews, Scotland, UK.

 

Pictorial images are 2-dimensional surfaces with a coherent pattern of lightness and color contrast that typically yield an illusory impression of 3D shape and space. This impression, however, is not as compelling as the impression of 3-dimensionality in stereoscopic images. Similarly, the impression of depth and 3-dimensionality in real scenes viewed with one eye, or even with both eyes when the objects are at a far distance, is not as compelling as that perceived in real scenes viewed from a near distance with both eyes. These less compelling instances of so-called "pictorial depth" are often believed to be merely high-level cognitive inferences of 3D shapes and distances based on visual cues, rather than constituting the perception of depth and 3-dimensionality per se. Only those conditions that yield the compelling impression of depth called stereopsis, for example due to the presence of binocular disparities or motion parallax, are conventionally thought to constitute depth perception proper. I present evidence and analysis that challenge this simple dichotomy. I argue that pictorial depth is automatic and mandatory, and that stereopsis and pictorial depth are both ways of perceiving depth and 3-dimensionality, but with different phenomenology and adaptive significance.

 


 

Title: Integration of Monocular and Binocular Cues to Depth Perception

Author: Christopher W. Tyler

Affiliation: Smith-Kettlewell Eye Research Institute, San Francisco, USA. / Division of Optometry, City University of London, UK.

 

To specify the depth structure of the visual scene, we need to decode the depth information from a variety of depth cues, many of which are inherently sparse across space. All depth cues are sparse wherever the scene has uniform shading. Additionally, disparity and motion are sparse wherever a correspondence cannot be established. When cues are sparse, the net perceived depth is subject to discontinuities between the depths specified by each cue unless they are all consistently scaled in both the absolute and relative depth parameters. Thus, the cue combination map must have a rescaling mechanism by which to combine the sparse cues for minimum mismatch in the resultant depth map. If this cue rescaling process relies on Bayesian estimation of the reliability of each cue, the reliability estimation must have a time course somewhere between neural response times and evolutionary time. The time course of depth cue rescaling was measured by matching the percept of an exponential stereoscopic depth image with the percept of a corresponding monocular depth image. The perceived depth dynamics had time constants of up to several seconds that varied among cues, suggesting that the time course of reliability-based Bayesian reweighting is of the order of perceptual adaptation times.
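
As an illustration of the kind of mechanism described above, the sketch below implements reliability-weighted (inverse-variance) cue combination with a slow exponential update of each cue's reliability. The two-cue scenario, the cue identities, and the 3-second time constant are illustrative assumptions, not values taken from the study.

import numpy as np

def combine_cues(estimates, reliabilities):
    # Inverse-variance (reliability) weighted average of per-cue depth estimates.
    w = reliabilities / reliabilities.sum()
    return float(np.dot(w, estimates))

def update_reliability(r_old, r_target, dt, tau=3.0):
    # Exponential approach of each cue's reliability toward its newly
    # estimated value; tau (seconds) is an assumed adaptation-scale constant.
    alpha = 1.0 - np.exp(-dt / tau)
    return r_old + alpha * (r_target - r_old)

# Hypothetical example: a disparity cue and a shading cue signal different
# depths; as reliabilities are re-estimated over several seconds, the
# combined depth drifts toward the more reliable cue.
estimates = np.array([2.0, 1.2])          # depth signalled by each cue (arbitrary units)
reliability = np.array([1.0, 1.0])        # initial, equal reliabilities
target_reliability = np.array([4.0, 0.5])
for step in range(20):
    reliability = update_reliability(reliability, target_reliability, dt=0.5)
    print(f"t = {(step + 1) * 0.5:4.1f} s   combined depth = {combine_cues(estimates, reliability):.3f}")

The slow drift of the combined estimate is the behavioural signature discussed above: the weighting, not the individual cue estimates, changes on the time scale of perceptual adaptation.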

 


 

Title: Binocular Correlation and Matching Computations Determine Depth Perception in a Weighted Parallel Manner

Author: Ichiro Fujita

Affiliation: Osaka University/ Center for Information and Neural Networks, Osaka, Japan.

 

The first step in computing binocular disparity is carried out in the primary visual cortex by a process similar to the calculation of a local cross-correlation between the left and right retinal images. The correlation-based neural signals convey information about false disparities as well as the true disparity. Processing at later stages then eliminates these false responses from the initial disparity detectors so as to encode only the disparities of features correctly matched between the two eyes. For a simple stimulus configuration, a feed-forward nonlinear process can transform the correlation signal into the match signal. Psychophysics in human observers suggests that depth judgement is determined by a weighted sum of the correlation and match signals rather than depending entirely on the latter. The relative weight varies with the spatial and temporal parameters of the stimuli, allowing adaptive recruitment of the two computations under different visual circumstances. A full transformation from a correlation-based to a match-based representation occurs at the neuronal population level in cortical area V4. Neurons in area V5/MT represent disparity in a manner intermediate between the correlation and match signals. We propose that the correlation and match signals in these areas contribute to depth perception in a weighted, parallel manner.
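
To make the weighted-parallel readout concrete, the sketch below computes a correlation-based disparity signal by local cross-correlation, derives a crude match-like signal by half-wave rectifying it (a stand-in for the feed-forward nonlinearity mentioned above), and reads out disparity from a weighted sum of the two. The stimulus, the rectification, and the weight w = 0.4 are illustrative assumptions, not the authors' model.

import numpy as np

def correlation_signal(left, right, disparities):
    # Normalized cross-correlation of the two eyes' signals at each candidate disparity.
    return np.array([np.corrcoef(left, np.roll(right, d))[0, 1] for d in disparities])

def match_signal(corr):
    # Crude match-like signal: half-wave rectify the correlation so that
    # anticorrelated (false-match) responses do not drive the readout.
    return np.maximum(corr, 0.0)

def depth_readout(corr, match, w=0.4):
    # Weighted parallel readout: weight w on correlation, (1 - w) on match.
    return w * corr + (1.0 - w) * match

rng = np.random.default_rng(0)
left = rng.standard_normal(256)
true_disparity = 5
right = np.roll(left, -true_disparity)     # right-eye image shifted by the true disparity
disparities = np.arange(-10, 11)

corr = correlation_signal(left, right, disparities)
combined = depth_readout(corr, match_signal(corr))
print("estimated disparity:", disparities[int(np.argmax(combined))])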

 


 

Title: Pooling in V1 Parameter Space Enhances Accuracy of Binocular Matching

Author: Izumi Ohzawa

Affiliation: Osaka University/ Center for Information and Neural Networks, Osaka, Japan.

 

The key problem of stereopsis is traditionally defined as accurately finding the positional offsets of corresponding object features between the left and right images. Here, we demonstrate that the problem must be considered in a four-dimensional parameter space: not only with respect to shifts in position (X, Y), but also in spatial frequency (SF) and orientation (OR). The proposed model pools the outputs of binocular energy units linearly over the multi-dimensional V1 parameter space (X, Y, SF, OR). Contrary to the common expectation that pooling reduces neural selectivity, our theoretical analyses and physiological experiments show that many binocular neurons achieve sharpened binocular tuning properties by pooling the outputs of multiple neurons with relatively broad tuning. Pooling in the space domain sharpens disparity-selective responses in the SF domain, so that responses to combinations of unmatched left-right SFs are attenuated. Conversely, pooling in the SF domain sharpens disparity selectivity in the space domain, reducing the possibility of false matches. Analogous effects are observed in the OR domain: spatial pooling sharpens binocular tuning for orientation, and such neurons can achieve sharpened tuning for relative orientation disparity. Therefore, pooling in V1 parameter space enhances the left-right matching accuracy of disparity-selective neurons.
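
A minimal sketch of the underlying building block may help: the code below implements a standard binocular energy unit (a quadrature pair of left/right Gabor filters whose responses are summed and squared) and then pools several such units linearly over position, illustrating pooling along one axis of the (X, Y, SF, OR) space. The Gabor parameters, the number of pooled subunits, and the random-noise stimulus are illustrative assumptions, not the values used in the experiments.

import numpy as np

def gabor(x, x0, sf, phase, sigma=8.0):
    # 1D Gabor receptive-field profile centered at x0 with spatial frequency sf.
    return np.exp(-(x - x0) ** 2 / (2 * sigma ** 2)) * np.cos(2 * np.pi * sf * (x - x0) + phase)

def energy_unit(left, right, x, x0, sf, pref_disp):
    # Binocular energy: quadrature pair of left/right Gabors, responses summed
    # and squared. Preferred disparity is a position shift between the eyes.
    e = 0.0
    for phase in (0.0, np.pi / 2):
        rl = np.dot(left, gabor(x, x0, sf, phase))
        rr = np.dot(right, gabor(x, x0 + pref_disp, sf, phase))
        e += (rl + rr) ** 2
    return e

def pooled_unit(left, right, x, centers, sf, pref_disp):
    # Linear pooling of energy subunits at several positions (pooling over X).
    return sum(energy_unit(left, right, x, c, sf, pref_disp) for c in centers)

rng = np.random.default_rng(1)
x = np.arange(256.0)
left = rng.standard_normal(256)
right = np.roll(left, 4)                    # stimulus disparity of 4 samples

disparities = np.arange(-8, 9)
single = [energy_unit(left, right, x, 128.0, 0.05, d) for d in disparities]
pooled = [pooled_unit(left, right, x, np.arange(96.0, 161.0, 16.0), 0.05, d) for d in disparities]
print("single-unit peak at disparity", disparities[int(np.argmax(single))])
print("pooled-unit peak at disparity", disparities[int(np.argmax(pooled))])

In this toy setting the pooled unit's disparity tuning is less perturbed by the particular noise sample than the single unit's, which is the intuition behind pooling reducing false matches.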

 


Important Dates

Call for abstracts: Nov 15, 2016

Symposium submission deadline: Feb 28, 2017

Abstract submission deadline: Apr 17, 2017 (extended from Mar 31, 2017)

Early registration deadline: Apr 30, 2017 (extended from Mar 31, 2017)

All deadlines are at midnight in the latest time zone on Earth.