Published June 1989
Book Section - Chapter | Open Access

A computational model of texture segmentation

Abstract

An algorithm for finding texture boundaries in images is developed on the basis of a computational model of human texture perception. The model consists of three stages: (1) the image is convolved with a bank of even-symmetric linear filters, followed by half-wave rectification, to give a set of responses; (2) inhibition, localized in space, within and among the neural response profiles suppresses weak responses when there are strong responses at the same or nearby locations; and (3) texture boundaries are detected as peaks in the gradients of the inhibited response profiles. The model is precisely specified, applies equally to grey-scale and binary textures, and is motivated by detailed comparison with psychophysics and physiology. Its predictions of the degree of discriminability of different texture pairs match experimental measurements of discriminability in human observers very well. From a machine-vision point of view, the scheme is a high-quality texture-edge detector which works equally well on images of artificial and natural scenes. The algorithm makes use of simple local and parallel operations, which makes it potentially real-time.
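The three stages described above can be sketched in plain numpy. Everything below — the filter size and tuning, the 0.5 inhibition threshold, restricting inhibition to the same pixel rather than a neighborhood, and the 8x8 pooling applied before the gradient — is an illustrative assumption for a minimal sketch, not the authors' exact model.

```python
import numpy as np

def even_filter(size, sigma, freq, theta):
    """Even-symmetric (cosine-phase, Gabor-like) filter.
    Parameters here are illustrative, not the paper's."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    f = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) * np.cos(2 * np.pi * freq * xr)
    return f - f.mean()          # zero mean: no response to uniform regions

def convolve2d(img, k):
    """Naive zero-padded 'same' 2-D convolution (numpy-only, for clarity)."""
    kh, kw = k.shape
    pad = np.pad(img, ((kh // 2, kh - kh // 2 - 1), (kw // 2, kw - kw // 2 - 1)))
    kr = k[::-1, ::-1]           # flip kernel: convolution, not correlation
    out = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * kr)
    return out

def texture_boundary_strength(img, thetas=(0.0, np.pi / 2)):
    # Stage 1: even-symmetric filter bank + half-wave rectification
    # (positive and negative parts become separate response channels).
    channels = []
    for th in thetas:
        r = convolve2d(img, even_filter(9, 2.0, 0.25, th))
        channels += [np.maximum(r, 0.0), np.maximum(-r, 0.0)]
    R = np.stack(channels)                         # (n_channels, H, W)
    # Stage 2: inhibition -- zero a channel where it is weak relative to
    # the strongest response at the same pixel (the paper's inhibition
    # also acts over *nearby* locations; 0.5 is an assumed threshold).
    R = np.where(R < 0.5 * R.max(axis=0, keepdims=True), 0.0, R)
    # Pool (smooth) the inhibited responses before differentiating -- an
    # assumed simplification of the texture-gradient computation.
    box = np.ones((8, 8)) / 64.0
    R = np.stack([convolve2d(r, box) for r in R])
    # Stage 3: gradient magnitude of the pooled responses; texture
    # boundaries show up as peaks of this summed gradient energy.
    gy, gx = np.gradient(R, axis=(1, 2))
    return np.sqrt(gx**2 + gy**2).sum(axis=0)

# Demo: vertical stripes (left half) abutting horizontal stripes (right
# half) -- a classic binary texture pair; boundary strength peaks near
# the column where the two textures meet.
n = 48
stripe = (np.arange(n) // 2) % 2                   # period-4 square wave
img = np.empty((n, n))
img[:, :n // 2] = stripe[:n // 2][None, :]         # varies along columns
img[:, n // 2:] = stripe[:, None]                  # varies along rows
E = texture_boundary_strength(img)
```

The demo reflects the abstract's claim that the same machinery handles binary textures with no special casing: texture interiors, where the pooled channel responses are locally constant, contribute almost nothing, while the change in which channels survive inhibition across the texture border produces a ridge of gradient energy there.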

Additional Information

© 1989 IEEE. Date of Current Version: 06 August 2002. We thank Martin Banks, Bela Julesz and Paul Kube for their useful comments. In particular, we would like to thank Dr. Julesz for suggesting the comparison of our model with Krose's psychophysical data. This research was funded by an IBM Faculty Development award and by DARPA contract N00039-88-C-0292.

Attached Files

Published - MALcvpr89.pdf (696.9 kB)
md5:c1139b20f5536e73b5fcafe56997fb61

Additional details

Created: August 19, 2023
Modified: October 17, 2023