Modeling the Discrimination of Orientation-Defined Texture Edges

S. Sabina Wolfson & Michael S. Landy

Department of Psychology and Center for Neural Science, New York University, New York, NY

Invest Ophth & Vis Sci (1994), 35(4), abstract #1907, page S1667


Purpose. We previously (ARVO, 1993) examined factors governing texture segregation performance for edges defined solely by a change of orientation. We are now evaluating a model to account for those data.

Methods. Each stimulus was a circular field (diam: 12.2 deg) of randomly placed, blurred, short (len: 0.3 deg) line segments. A vertical (θ = 0°) edge that was either "straight" or "wavy" (sinusoidal) separated the field into two differently textured regions. The line orientation was θL on the left side and θR on the right, and the orientation difference across the edge was Δθ = |θR − θL|. The subject's task was to discriminate between wavy and straight edges (2AFC).
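The stimulus construction described above can be sketched as follows. The field diameter, segment length, and the θL/θR assignment across the edge come from the abstract; the number of segments, the waviness amplitude, and the waviness frequency are illustrative assumptions, as is the orientation convention (θ = 0° vertical, so segments at θL = 0° lie parallel to the edge).

```python
import numpy as np

rng = np.random.default_rng(0)

def make_stimulus(theta_L, theta_R, wavy, diam_deg=12.2, seg_len_deg=0.3,
                  n_segments=800, wave_amp_deg=0.5, wave_freq=0.5):
    """Endpoints of randomly placed line segments inside a circular field,
    oriented theta_L left of the (possibly wavy) vertical edge and theta_R
    right of it.  n_segments, wave_amp_deg, and wave_freq are illustrative
    assumptions, not values from the abstract."""
    r = diam_deg / 2.0
    # random segment centers inside the circular field (rejection sampling)
    pts = rng.uniform(-r, r, size=(4 * n_segments, 2))
    pts = pts[np.hypot(pts[:, 0], pts[:, 1]) <= r][:n_segments]
    # vertical edge at x = 0, sinusoidally displaced when "wavy"
    edge_x = wave_amp_deg * np.sin(2 * np.pi * wave_freq * pts[:, 1]) if wavy else 0.0
    theta = np.where(pts[:, 0] < edge_x, theta_L, theta_R)  # degrees
    # endpoints: center +/- half the segment length along orientation theta
    d = 0.5 * seg_len_deg * np.c_[np.sin(np.deg2rad(theta)),
                                  np.cos(np.deg2rad(theta))]
    return pts - d, pts + d   # (start, end) endpoints, in degrees

starts, ends = make_stimulus(theta_L=0.0, theta_R=90.0, wavy=True)
```

The returned endpoints would then be rendered and blurred to produce the displayed image; that rendering step is omitted here.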

Model. In the first stage of the model, each stimulus is passed through a bank of orientation- and spatial-frequency-selective linear filters (half-height bandwidths of 1 octave and 30°), followed by a pointwise nonlinearity (squaring, x²). This converts a texture edge into a noisy intensity edge. The decision rule is based on edge enhancement of these processed images, followed by cross-correlation with straight-edge and wavy-edge templates.
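The filter–square–correlate pipeline can be sketched as below. The filter parameters (size, spatial frequency, orientation sampling) and the pooling over the bank are illustrative assumptions, not the fitted 1-octave / 30° bandwidth values, and the edge-enhancement step is reduced here to mean-removal before template correlation.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor(size, sf_cpd, theta_deg, px_per_deg=16):
    """One even-symmetric Gabor filter; the envelope width is a crude
    choice (assumption), not fitted to the model's bandwidths."""
    sigma = 0.5 / sf_cpd
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1] / px_per_deg
    t = np.deg2rad(theta_deg)
    xr = x * np.cos(t) + y * np.sin(t)
    yr = -x * np.sin(t) + y * np.cos(t)
    return (np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * sf_cpd * xr))

def model_decision(image, templates, sf_cpd=2.0, thetas=range(0, 180, 30)):
    """Filter -> square -> pool over the bank -> cross-correlate with
    straight/wavy edge templates; the response is the template with the
    larger peak correlation.  A sketch of the pipeline, not the model."""
    pooled = sum(fftconvolve(image, gabor(33, sf_cpd, th), mode='same') ** 2
                 for th in thetas)                 # squaring nonlinearity
    centered = pooled - pooled.mean()              # stand-in for edge enhancement
    scores = {name: np.max(fftconvolve(centered, tmpl[::-1, ::-1], mode='same'))
              for name, tmpl in templates.items()}
    return max(scores, key=scores.get)
```

In use, `image` would be a rendered stimulus and `templates` a dict mapping `'straight'` and `'wavy'` to small edge-shaped kernels; both template shapes here are placeholders.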

Results. For both human subjects and the model, at θL = 0° or 90°, best performance occurs when Δθ = 90° (the maximum possible orientation difference). For human subjects, at fixed Δθ = 90° and varied θL, performance at θL = 0° and at 90° (when the line segments are parallel and perpendicular to the edge) is significantly better than when θL is oblique. In addition, for human subjects, at θL = 135°, best performance is obtained when θR = 0°.

Conclusions. A simple filtering model can account for performance at fixed θL = 0° or 90°. Modifications of the model required to be consistent with the other aspects of the data will be discussed.


Commercial relationships: None. Supported by NIH grant EY08266.