The aim of this project is to explore the role of tactile signals in shaping the psychophysical sensation of materials and, consequently, their contribution to material recognition by an artificial intelligence model.
When humans interact with material surfaces through touch (e.g., sliding, tapping, pressing), they perceive multi-dimensional attributes such as smoothness and softness, which enable material recognition. However, how tactile signals are rapidly translated into perceptual understanding remains unclear, limiting the integration of touch into digital worlds. Our work aims to bridge this gap by combining human intelligence and AI to uncover the mechanisms of tactile perception, developing three interconnected models that progressively decode tactile information. Our methodology will advance the scientific understanding of human touch perception and reveal differences not only in the final decisions but also in the decision-making processes between AI models guided by human insights and those developed without such guidance.
Members
Li Zou
Postdoctoral Researcher
MSc. Student
Yasemin Vardar
Principal Investigator
Funding Information
NWO-VENI
Other Related Funding
Delft Technology Fellowship
Learning to Feel Textures: Predicting Perceptual Similarities from Unconstrained Finger-Surface Interactions
IEEE Transactions on Haptics, 2022
[paper] [code] [data]
Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity.
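To make the pipeline concrete, the sketch below (an illustration, not the authors' released implementation) windows a recorded interaction signal, summarizes each window with a simple feature, compares the resulting feature distributions between two surfaces, and checks how well these distances track human similarity ratings via Spearman correlation. The signal names, window length, RMS feature, and Wasserstein distance are illustrative assumptions.

```python
import numpy as np
from scipy.stats import wasserstein_distance, spearmanr

def window_features(signal, fs, win_s=0.1):
    """Split a 1-D interaction signal (e.g., fingernail acceleration)
    into short windows and summarize each with a feature (here: RMS).
    Window length and feature choice are illustrative assumptions."""
    n = int(win_s * fs)
    windows = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
    return np.array([np.sqrt(np.mean(w ** 2)) for w in windows])

def surface_distance(sig_a, sig_b, fs):
    """Dissimilarity between two surfaces: distance between the
    probability distributions of their windowed features."""
    return wasserstein_distance(window_features(sig_a, fs),
                                window_features(sig_b, fs))

# Hypothetical usage: compare model distances with one participant's
# pairwise similarity ratings (higher rating = more similar).
fs = 1000  # Hz, assumed sampling rate
rng = np.random.default_rng(0)
recordings = {name: rng.standard_normal(fs * 2) * scale
              for name, scale in [("paper", 1.0), ("foam", 0.3), ("metal", 2.0)]}
pairs = [("paper", "foam"), ("paper", "metal"), ("foam", "metal")]
model_dist = [surface_distance(recordings[a], recordings[b], fs) for a, b in pairs]
ratings = [4.0, 2.0, 1.5]  # made-up similarity judgments for the same pairs
rho, _ = spearmanr(model_dist, [-r for r in ratings])  # distances vs. dissimilarities
print(f"Spearman correlation: {rho:.2f}")
```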
Both vision and touch contribute to the perception of real surfaces. Although there have been many studies on the individual contributions of each sense, it is still unclear how each modality's information is processed and integrated. Here, we investigated the similarity of the visual and haptic perceptual spaces, as well as how well each correlates with fingertip interaction metrics. We found that real surfaces are similarly organized within the three-dimensional perceptual spaces of both modalities, that the dimensions of each space can be represented by roughness/smoothness, hardness/softness, and friction, and that these dimensions can be estimated from surface vibration power, tap spectral centroid, and kinetic friction coefficient, respectively.
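For concreteness, here is a minimal sketch of how the three interaction metrics named above could be computed from recorded fingertip signals. The signal names and exact definitions (power as mean squared acceleration, centroid as the amplitude-weighted mean frequency, friction as the tangential-to-normal force ratio) are assumptions for illustration, not the paper's exact formulations.

```python
import numpy as np

def vibration_power(slide_accel):
    """Surface vibration power: mean squared fingertip acceleration
    during sliding (definition assumed for illustration)."""
    return np.mean(np.asarray(slide_accel) ** 2)

def tap_spectral_centroid(tap_accel, fs):
    """Spectral centroid of the acceleration transient from a tap:
    amplitude-weighted mean frequency of its magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(tap_accel))
    freqs = np.fft.rfftfreq(len(tap_accel), d=1.0 / fs)
    return np.sum(freqs * spectrum) / np.sum(spectrum)

def kinetic_friction_coefficient(tangential_force, normal_force):
    """Ratio of tangential to normal contact force during steady sliding."""
    return np.mean(np.asarray(tangential_force) / np.asarray(normal_force))
```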
Generating realistic texture sensations on tactile displays using data-driven methods has attracted considerable interest over the last decade. However, the need for large data storage and high transmission rates complicates the use of these methods in future commercial displays. Here, we propose a new texture rendering approach that substantially compresses texture data for electrostatic displays. Using three sample surfaces, we first explain how to record, analyze, and compress the texture data and render it on a touchscreen. Then, through psychophysical experiments conducted with nineteen participants, we show that the textures can be reproduced with significantly fewer frequency components than the original signal contains, without inducing perceptual degradation.
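The core compression idea, keeping only the few strongest frequency components of a recorded texture signal and resynthesizing the rendering signal from them, can be sketched roughly as below. The use of a plain FFT, the number of retained components, and the synthetic recording are illustrative assumptions, not the rendering pipeline used in the paper.

```python
import numpy as np

def compress_texture(signal, n_components=10):
    """Keep only the n strongest frequency components of a recorded
    texture signal (illustrative stand-in for the compression step)."""
    spectrum = np.fft.rfft(signal)
    keep = np.argsort(np.abs(spectrum))[-n_components:]  # strongest bins
    compressed = np.zeros_like(spectrum)
    compressed[keep] = spectrum[keep]
    return compressed, keep

def reconstruct_texture(compressed_spectrum, n_samples):
    """Resynthesize the rendering signal from the retained components."""
    return np.fft.irfft(compressed_spectrum, n=n_samples)

# Hypothetical usage on a synthetic texture recording
fs = 10_000                    # Hz, assumed sampling rate
t = np.arange(0, 0.5, 1 / fs)  # 0.5 s synthetic recording
texture = (np.sin(2 * np.pi * 60 * t)
           + 0.4 * np.sin(2 * np.pi * 240 * t)
           + 0.05 * np.random.default_rng(1).standard_normal(t.size))
spec, kept = compress_texture(texture, n_components=5)
rendered = reconstruct_texture(spec, t.size)
print(f"kept {kept.size} of {spec.size} frequency bins")
```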