Purpose. We examined texture discrimination performance for classes of textures that isolate small-scale (``edge-based'') and large-scale (``region-based'') texture analysis mechanisms.
Methods. Each stimulus was a square field (8.8 deg wide) of randomly placed, blurred, short (0.2 deg) line segments (100% white-on-gray Weber contrast, 39 lines/deg^2). Each line segment's orientation was drawn from a Gaussian distribution. One such distribution determined the orientations on the left side of the square; a second distribution was used for the right side. In one temporal interval the two distributions were identical; in the other they differed. The task was to select the interval in which the left- and right-hand textures differed (2IFC with feedback). Four conditions were examined: (1) the two textures were separated by a central, uniform gray strip (0.7 deg wide); (2) the central strip was filled with randomly oriented lines (all orientations equally likely); (3) the central strip contained a smooth texture gradient (a ramp from the left-hand orientation distribution to the right-hand orientation distribution); and (4) the two textures met at an abrupt edge (randomly located within the strip). The eccentricity of the useful texture and edge information was nearly constant across conditions.
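The stimulus construction described above can be sketched in code. This is an illustrative reconstruction only, not the authors' actual stimulus software: the function and variable names are hypothetical, and the strip geometry (a vertical strip centered in the field) and linear interpolation of the Gaussian parameters in condition 3 are assumptions consistent with, but not dictated by, the text.

```python
import numpy as np

# Parameters taken from the Methods text; everything else is illustrative.
FIELD_DEG = 8.8   # square field width (deg)
STRIP_DEG = 0.7   # central strip width (deg)
DENSITY = 39      # line segments per deg^2

rng = np.random.default_rng(0)

def make_orientations(mean_left, mean_right, sd_left, sd_right, condition):
    """Return (x, y, theta) for one stimulus field.

    condition: 1 = uniform gray strip (no lines in strip),
               2 = random-orientation lines in strip,
               3 = smooth gradient (ramp) across strip,
               4 = abrupt edge at a random location within the strip.
    Orientations are in degrees.
    """
    n = int(DENSITY * FIELD_DEG ** 2)
    x = rng.uniform(0.0, FIELD_DEG, n)
    y = rng.uniform(0.0, FIELD_DEG, n)
    lo = (FIELD_DEG - STRIP_DEG) / 2.0   # left boundary of central strip
    hi = (FIELD_DEG + STRIP_DEG) / 2.0   # right boundary of central strip

    left = x < lo
    right = x > hi
    strip = ~(left | right)

    theta = np.empty(n)
    theta[left] = rng.normal(mean_left, sd_left, left.sum())
    theta[right] = rng.normal(mean_right, sd_right, right.sum())

    if condition == 1:
        # Uniform gray strip: simply omit the lines that fell inside it.
        keep = ~strip
        return x[keep], y[keep], theta[keep]
    if condition == 2:
        # All orientations equally likely within the strip.
        theta[strip] = rng.uniform(0.0, 180.0, strip.sum())
    elif condition == 3:
        # Linear ramp of mean and SD from the left to the right distribution.
        w = (x[strip] - lo) / STRIP_DEG
        theta[strip] = rng.normal((1 - w) * mean_left + w * mean_right,
                                  (1 - w) * sd_left + w * sd_right)
    elif condition == 4:
        # Abrupt edge at a random position inside the strip.
        edge = rng.uniform(lo, hi)
        in_strip = x[strip]
        theta[strip] = np.where(in_strip < edge,
                                rng.normal(mean_left, sd_left, strip.sum()),
                                rng.normal(mean_right, sd_right, strip.sum()))
    return x, y, theta
```

A trial would then draw two such fields, one with identical left/right parameters and one with differing parameters (in mean or in SD), and present them in random order for the 2IFC judgment.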
Results. For a texture difference in orientation variance (with means identical within a trial, though randomized across trials), performance was no better with a texture edge (cond. 4) than when the edge was removed, masked, or smoothed (conds. 1-3). In contrast, a texture difference in mean orientation led to superior performance when a texture edge was present (cond. 4).
Conclusions. These two patterns of results demonstrate the existence of two distinct mechanisms for texture discrimination. One is small-scale and depends on the availability of a texture edge. The other is large-scale, pools over uniform texture regions, and is unperturbed by spurious edges between the textures being compared.
Supported by NIH grant EY08266.