On Probing Appearance:

Testing material-lighting interactions in an image-based canonical approach



Materials are omnipresent. Recognising a material helps us infer its physical and chemical properties, for instance whether it is compressible, slippery, sweet, or juicy. Yet in the literature, material perception has received far less attention than object perception. This dissertation presents studies on a method to systematically measure human visual perception of opaque materials and to test the influence of lighting and shape on material perception.

In our studies, we applied multiple psychophysical methods, such as matching, discrimination, and perceptual scaling, to test how human observers visually perceive materials.

Combining four material modes with three lighting modes, we presented a canonical set that, in combination with optical mixing, supports a painterly approach in which key image features can be varied directly. With this method we were able to test and predict light-material interactions using both photographs of real objects and computer-rendered stimuli.
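The core idea of optical mixing can be illustrated as a weighted blend of canonical mode images, where the weights play the role of the probe's adjustable parameters. The sketch below is a hypothetical illustration only: the function name, the use of NumPy arrays, and the specific mode images are assumptions, not the dissertation's actual implementation.

```python
import numpy as np

def mix_modes(mode_images, weights):
    """Blend canonical material-mode images (e.g. matte, specular, velvet,
    glittery) by a non-negative weighted average. Hypothetical sketch."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                   # normalise so the mix stays in range
    stack = np.stack(mode_images)     # shape: (n_modes, H, W) or (n_modes, H, W, 3)
    # Contract the weights against the mode axis to get one mixed image.
    return np.tensordot(w, stack, axes=1)

# Toy example: four 2x2 grayscale "mode images" with constant intensities.
modes = [np.full((2, 2), v, dtype=float) for v in (0.0, 1.0, 2.0, 3.0)]
mixed = mix_modes(modes, [0.25, 0.25, 0.25, 0.25])  # equal weights -> average
```

In a probing experiment, an observer would adjust the weights until the mixed image matches the appearance of a test stimulus, turning the weights themselves into a quantitative perceptual measurement.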

To conclude, our research mainly contributed: 1) a novel probing method that smoothly mixes image features of the proximal stimulus, instead of varying the distal physical properties of the stimuli, together with a validation that the method works and allows quantitative measurement of material perception and material-lighting interactions; 2) an understanding of the visual perception of opaque materials and material-light interactions across a wide ecological variety; 3) a validated model for predicting material-dependent lighting effects for matte, specular, velvet, and glittery materials; and 4) interpretations of the material perception results in relation to shape and light. Our findings can be applied to many areas, such as industrial design, education, e-commerce, computer graphics, and future psychophysical studies.


For more details and references, please see the TU Delft repository.