Texture Encoding & Rendering

While interacting with physical objects, humans experience rich tactile sensations, including texture, friction, roughness, temperature, and deformability. We explore the intricate mechanisms underlying human tactile texture perception and investigate how these rich tactile cues can be captured, encoded, and displayed effectively in digital form.

Related Publications

SENS3: Multisensory Database of Finger-Surface Interactions and Corresponding Sensations

The growing demand for natural interactions with technology underscores the importance of achieving realistic touch sensations in digital environments. Realizing this goal depends heavily on comprehensive databases of finger-surface interactions, which need further development. Here, we present SENS3, an extensive open-access repository of multisensory data acquired from fifty surfaces when two participants explored them with their fingertips through static contact, pressing, tapping, and sliding. SENS3 encompasses high-fidelity visual, audio, and haptic information recorded during these interactions, including videos, sounds, contact forces, torques, positions, accelerations, skin temperature, heat flux, and surface photographs. Additionally, it incorporates thirteen participants' psychophysical sensation ratings collected while they explored these surfaces freely. We anticipate that SENS3 will be valuable for advancing multisensory texture rendering, user experience development, and touch sensing in robotics.

Accepted to EuroHaptics 2024

[preprint] [website]
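To illustrate how such a repository might be used, below is a minimal Python sketch for loading the signals of one surface-gesture trial. The directory layout, file names, and column contents are our own illustrative assumptions, not the actual SENS3 format; consult the repository documentation for the real structure.

```python
# Minimal sketch of loading one trial from a SENS3-style repository.
# The directory layout, file names, and column contents below are
# illustrative assumptions, not the actual SENS3 format.
from pathlib import Path
import numpy as np

def load_trial(root: str, surface_id: int, gesture: str) -> dict:
    """Load the signals recorded for one surface/gesture combination."""
    trial = Path(root) / f"surface_{surface_id:02d}" / gesture  # hypothetical layout
    return {
        # force/torque from a 6-axis sensor: columns Fx, Fy, Fz, Tx, Ty, Tz
        "force_torque": np.loadtxt(trial / "force_torque.csv", delimiter=","),
        # fingertip position and acceleration traces
        "position": np.loadtxt(trial / "position.csv", delimiter=","),
        "acceleration": np.loadtxt(trial / "acceleration.csv", delimiter=","),
        # skin temperature and heat flux during static contact
        "temperature": np.loadtxt(trial / "temperature.csv", delimiter=","),
    }

# Example: inspect the sliding interaction on a hypothetical surface 7.
# signals = load_trial("SENS3", surface_id=7, gesture="sliding")
# print(signals["force_torque"].shape)
```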

Preserving Texture Realism Across Remote Actuator Placement and Variable Fingertip Velocity

Wearable haptic displays that relocate feedback away from the fingertip provide a much-needed sense of touch in virtual reality while leaving the fingertip free from occlusion for augmented reality tasks. However, the impact of relocation on perceptual sensitivity to dynamic changes in actuation during active movement remains unclear. In this work, we investigate the perceived realism of virtual textures rendered via vibrations relocated to the base of the index finger and compare three methods of modulating vibrations with active finger speed. The first two methods proportionally changed either the frequency or the amplitude of vibration with finger speed, while the third did not modulate the vibration at all. In psychophysical experiments, participants compared the modulation types to each other, as well as to real 3D-printed textured surfaces. Results suggest that frequency modulation produces more realistic sensations for coarser textures, whereas participants were less discerning of modulation type for finer textures. Additionally, we presented virtual textures either fully in midair or in an augmented-reality condition in which the finger contacted a flat surface; while we found no difference in experimental performance, participants were divided, each strongly preferring either the contact or non-contact condition.

IEEE Transactions on Haptics, 2023

IEEE World Haptics Conference 2021, Interactive Demonstration & Work-in-Progress Paper

[paper] [code] [data]
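As a rough illustration of the three rendering schemes compared in this work, the sketch below generates a vibration waveform from a finger-speed trace. It assumes the common grating model in which a texture of spatial period λ stroked at speed v excites a vibration at frequency f = v/λ; all parameter values are illustrative and not taken from the paper.

```python
# Sketch of three ways to modulate a vibrotactile texture with finger
# speed. Assumes the standard grating model: a texture of spatial
# period LAM (m) stroked at speed v (m/s) excites a vibration at
# f = v / LAM. All parameter values are illustrative.
import numpy as np

FS = 2000.0    # actuator sample rate (Hz)
LAM = 0.002    # texture spatial period (m)
F_REF = 100.0  # fixed frequency for amplitude modulation (Hz)
V_REF = 0.2    # reference speed that gives unit amplitude (m/s)

def render(speed: np.ndarray, mode: str) -> np.ndarray:
    """Return one vibration waveform for a finger-speed trace (m/s)."""
    t = np.arange(len(speed)) / FS
    if mode == "frequency":
        # Instantaneous frequency tracks speed; integrate it to get phase.
        phase = 2.0 * np.pi * np.cumsum(speed / LAM) / FS
        return np.sin(phase)
    if mode == "amplitude":
        # Fixed frequency; amplitude scales with speed.
        return (speed / V_REF) * np.sin(2.0 * np.pi * F_REF * t)
    # No modulation: constant vibration regardless of speed.
    return np.sin(2.0 * np.pi * F_REF * t)

# Example: a stroke accelerating from rest to 0.3 m/s over one second.
speed = np.linspace(0.0, 0.3, int(FS))
wave = render(speed, mode="frequency")
```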

FeelPen: A Haptic Stylus Displaying Multimodal Texture Feels on Touchscreens

The rapidly growing mobile market has spurred interest in stylus-based interactions. Most state-of-the-art styluses either provide no haptic feedback or deliver only one type of sensation, such as vibration or skin stretch. Equipping these devices with the ability to display a palette of tactile feels could pave the way for rendering realistic surface sensations, resulting in more natural virtual experiences. However, integrating the necessary actuators and sensors while preserving the compact form factor required for comfortable interaction makes such styluses challenging to design. This limitation has also restricted scientific knowledge of the parameters relevant to rendering compelling artificial textures in stylus-based interactions. To address these challenges, we developed FeelPen, a haptic stylus that can display multimodal texture properties (compliance, roughness, friction, and temperature) on touchscreens. We validated the texture rendering capability of our design by conducting system identification and psychophysical experiments. The experimental results confirmed that FeelPen can render a variety of modalities over the wide parameter ranges necessary to create perceptually salient texture feels, making it a one-of-a-kind stylus. Our unique design and experimental results open new perspectives for stylus-based interactions on future touchscreens.

IEEE Transactions on Haptics, 2022

[paper] [code] [data]
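For a sense of what multimodal texture rendering involves, here is a purely hypothetical sketch of mapping a texture description to per-actuator commands in a FeelPen-style stylus. The actuator names, parameter ranges, and mappings are invented for illustration and do not reflect FeelPen's actual hardware interface.

```python
# Hypothetical sketch of dispatching multimodal texture parameters to
# separate actuators in a FeelPen-style stylus. The actuator names and
# parameter ranges are invented for illustration; they do not describe
# the paper's actual hardware interface.
from dataclasses import dataclass

@dataclass
class TextureParams:
    compliance: float   # 0 (rigid) .. 1 (soft)
    roughness: float    # vibration amplitude, 0 .. 1
    friction: float     # lateral-force gain, 0 .. 1
    temperature: float  # target tip temperature (deg C)

def render_texture(p: TextureParams) -> dict:
    """Translate one texture description into per-actuator commands."""
    return {
        "tip_servo_stiffness": 1.0 - p.compliance,   # softer -> lower stiffness
        "voice_coil_amplitude": p.roughness,         # roughness as vibration
        "friction_modulation_gain": p.friction,      # e.g., electroadhesion drive
        "peltier_setpoint_c": p.temperature,         # thermal display target
    }

print(render_texture(TextureParams(0.3, 0.6, 0.4, 25.0)))
```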

Learning to Feel Textures: Predicting Perceptual Similarities from Unconstrained Finger-Surface Interactions

Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity.  
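The core idea, comparing probability distributions of per-window interaction features and checking agreement with human judgments, can be sketched in a few lines of Python. The RMS feature, window length, and Wasserstein distance below are illustrative stand-ins for the paper's full feature set and comparison method.

```python
# Sketch of the distribution-comparison idea: summarize each surface by
# the distribution of a simple per-window feature, measure pairwise
# distances between those distributions, and check agreement with human
# similarity judgments via Spearman correlation. The RMS feature,
# window length, and distance are illustrative stand-ins for the
# paper's full pipeline.
import numpy as np
from scipy.stats import spearmanr, wasserstein_distance

def window_features(signal: np.ndarray, win: int = 200) -> np.ndarray:
    """RMS of each non-overlapping window of the signal."""
    n = len(signal) // win
    return np.sqrt((signal[: n * win].reshape(n, win) ** 2).mean(axis=1))

def predicted_dissimilarity(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Distance between the two surfaces' feature distributions."""
    return wasserstein_distance(window_features(sig_a), window_features(sig_b))

# Example with synthetic fingernail-acceleration traces for three surfaces.
rng = np.random.default_rng(0)
signals = [rng.normal(0.0, s, 10_000) for s in (0.5, 1.0, 2.0)]
pairs = [(0, 1), (0, 2), (1, 2)]
pred = [predicted_dissimilarity(signals[i], signals[j]) for i, j in pairs]
human = [0.4, 0.9, 0.5]  # placeholder dissimilarity ratings for the same pairs
rho, _ = spearmanr(pred, human)
```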

Both vision and touch contribute to the perception of real surfaces. Although there have been many studies on the individual contributions of each sense, it is still unclear how each modality's information is processed and integrated. Here, we investigated the similarity of the visual and haptic perceptual spaces, as well as how well each correlates with fingertip interaction metrics. We found that real surfaces are similarly organized within the three-dimensional perceptual spaces of both modalities. The dimensions of each space can be described as roughness/smoothness, hardness/softness, and friction, and these dimensions can be estimated from surface vibration power, tap spectral centroid, and kinetic friction coefficient, respectively.
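As a minimal sketch of the three interaction metrics named above, the functions below estimate vibration power, tap spectral centroid, and kinetic friction coefficient from recorded signals; the signal conventions and preprocessing are simplified assumptions.

```python
# Minimal sketch of the three interaction metrics named above. The
# signal conventions (sliding acceleration, tap transient, force
# components) and any preprocessing are simplified assumptions.
import numpy as np

def vibration_power(accel: np.ndarray) -> float:
    """Mean squared fingertip acceleration during sliding."""
    return float(np.mean(accel ** 2))

def tap_spectral_centroid(tap: np.ndarray, fs: float) -> float:
    """Amplitude-weighted mean frequency of a tap transient."""
    spectrum = np.abs(np.fft.rfft(tap))
    freqs = np.fft.rfftfreq(len(tap), d=1.0 / fs)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

def kinetic_friction_coefficient(f_tan: np.ndarray, f_norm: np.ndarray) -> float:
    """Mean ratio of tangential to normal force during steady sliding."""
    return float(np.mean(np.abs(f_tan) / np.clip(f_norm, 1e-6, None)))
```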

A Novel Texture Rendering Approach for Electrostatic Displays

Generating realistic texture feelings on tactile displays using data-driven methods has attracted considerable interest over the last decade. However, the large data storage and transmission rates these methods require complicate their use in future commercial displays. Here, we propose a new texture rendering approach that significantly compresses texture data for electrostatic displays. Using three sample surfaces, we first explain how to record, analyze, and compress the texture data and render it on a touchscreen. Then, through psychophysical experiments conducted with nineteen participants, we show that the textures can be reproduced with significantly fewer frequency components than the original signal contains, without inducing perceptual degradation.
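The compression idea can be sketched as keeping only the strongest frequency components of a recorded texture signal and resynthesizing from them. The selection rule below (largest-magnitude DFT components) is a simple illustrative choice and may differ from the paper's actual method.

```python
# Sketch of frequency-domain texture compression: keep only the k
# strongest DFT components of a recorded signal and resynthesize.
# The signal and k are illustrative; the paper's selection criterion
# may differ.
import numpy as np

def compress_texture(signal: np.ndarray, k: int) -> np.ndarray:
    """Reconstruct a signal from its k largest-magnitude DFT components."""
    spectrum = np.fft.rfft(signal)
    keep = np.argsort(np.abs(spectrum))[-k:]  # indices of strongest components
    sparse = np.zeros_like(spectrum)
    sparse[keep] = spectrum[keep]             # zero out everything else
    return np.fft.irfft(sparse, n=len(signal))

# Example: a noisy two-tone "texture" reduced to its two dominant components.
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 90 * t)
x += 0.1 * np.random.default_rng(1).normal(size=t.size)
x_hat = compress_texture(x, k=2)
```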

Realistic display of tactile textures on touchscreens would be a major step toward bringing haptic technology to the wide range of consumers who use electronic devices daily. Since texture topography cannot be rendered explicitly by electrovibration on touchscreens, it is important to understand how we perceive virtual textures displayed by friction modulation via electrovibration. Here, we investigated the roughness perception of real gratings made of plexiglass and, for comparison, virtual gratings displayed by electrovibration through a touchscreen. The results showed that the roughness perception of real and virtual gratings differs. We argue that this difference can be explained by the amount of fingerpad penetration into the gratings: for real gratings, penetration increased the tangential forces acting on the finger, whereas for virtual gratings, where skin penetration is absent, tangential forces decreased with spatial period.