Research Highlights

Published Research

Tactile Weight Rendering: A Review for Researchers and Developers

Haptic rendering of weight plays an essential role in naturalistic object interaction in virtual environments. While kinesthetic devices have traditionally been used for this purpose by applying forces on the limbs, tactile interfaces acting on the skin have recently offered potential solutions to enhance or replace kinesthetic ones. Here, we aim to provide an in-depth overview and comparison of existing tactile weight rendering approaches. We categorized these approaches based on their type of stimulation into asymmetric vibration and skin stretch, further divided according to the working mechanism of the devices. Then, we compared these approaches using various criteria, including the physical, mechanical, and perceptual characteristics of the reported devices and their potential applications. We found that asymmetric vibration devices have the smallest form factor, while skin stretch devices relying on the motion of flat surfaces, belts, or tactors present numerous mechanical and perceptual advantages for scenarios requiring more accurate weight rendering. Finally, we discussed the selection among the proposed categories of devices and their application scopes, together with the limitations and opportunities for future research. We hope this study guides the development and use of tactile interfaces to achieve more naturalistic object interaction and manipulation in virtual environments.

SENS3: Multisensory Database of Finger-Surface Interactions and Corresponding Sensations

The growing demand for natural interactions with technology underscores the importance of achieving realistic touch sensations in digital environments. Realizing this goal highly depends on comprehensive databases of finger-surface interactions, which need further development. Here, we present SENS3, an extensive open-access repository of multisensory data acquired from fifty surfaces when two participants explored them with their fingertips through static contact, pressing, tapping, and sliding. SENS3 encompasses high-fidelity visual, audio, and haptic information recorded during these interactions, including videos, sounds, contact forces, torques, positions, accelerations, skin temperature, heat flux, and surface photographs. Additionally, it incorporates thirteen participants' psychophysical sensation ratings while exploring these surfaces freely. We anticipate that SENS3 will be valuable for advancing multisensory texture rendering, user experience development, and touch sensing in robotics.

Design and evaluation of a multi-finger skin-stretch tactile interface for hand rehabilitation robots

Object properties perceived through the tactile sense, such as weight, friction, and slip, greatly influence motor control during manipulation tasks. However, the provision of tactile information during robotic training in neurorehabilitation has not been well explored. Therefore, we designed and evaluated a tactile interface based on a two-degrees-of-freedom moving platform mounted on a hand rehabilitation robot that provides skin stretch at four fingertips, from the index through the little finger. To accurately control the rendered forces, we included a custom magnetic-based force sensor to control the tactile interface in a closed loop. The technical evaluation showed that our custom force sensor achieved measurable shear forces of up to 8 N with accuracies of 95.2–98.4%, influenced by hysteresis, viscoelastic creep, and torsional deformation. The tactile interface accurately rendered forces with a step-response steady-state accuracy of 97.5–99.4% and a frequency response in the range of most activities of daily living. Our sensor showed the highest measurement range-to-size ratio and comparable accuracy to sensors of its kind. These characteristics enabled the closed-loop force control of the tactile interface for precise rendering of multi-finger two-dimensional skin stretch. The proposed system is a first step towards more realistic and rich haptic feedback during robotic sensorimotor rehabilitation, potentially improving therapy outcomes.
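The closed-loop control scheme described above can be illustrated with a minimal sketch. This is not the authors' implementation: the first-order actuator model, the PI gains, and the 2 N shear-force setpoint are hypothetical, chosen only to show how force-sensor feedback lets a skin-stretch interface track a commanded force despite plant dynamics.

```python
# Minimal sketch of closed-loop shear-force control (illustrative values only;
# not the paper's controller). The actuator and skin are lumped into a simple
# first-order model, and the force sensor is assumed to be ideal.

def simulate_force_control(target_n=2.0, steps=1000, dt=0.001,
                           kp=5.0, ki=40.0, tau=0.05):
    """PI control of the rendered shear force; returns the force trajectory."""
    force, integral, history = 0.0, 0.0, []
    for _ in range(steps):
        error = target_n - force            # feedback from the force sensor
        integral += error * dt
        command = kp * error + ki * integral
        force += (command - force) * dt / tau   # first-order plant response
        history.append(force)
    return history

trajectory = simulate_force_control()
steady_state = trajectory[-1]   # settles near the 2.0 N target within 1 s
```

Closing the loop around a force sensor in this way is what compensates for effects such as the hysteresis and viscoelastic creep mentioned above, which an open-loop command could not account for.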

arXiv 

[preprint]

Relocating thermal stimuli to the proximal phalanx may not affect vibrotactile sensitivity on the fingertip

Wearable devices that relocate tactile feedback from the fingertips can enable users to interact with their physical world augmented by virtual effects. While studies have shown that relocating same-modality tactile stimuli can influence those perceived at the fingertip, the interaction of cross-modal tactile stimuli remains unclear. Here, we investigate how thermal cues applied on the index finger's proximal phalanx affect vibrotactile sensitivity at the fingertip of the same finger when applied at varying contact pressures. We designed a novel wearable device that can deliver thermal stimuli at adjustable contact pressures on the proximal phalanx. Utilizing this device, we measured the detection thresholds of fifteen participants for a 250 Hz sinusoidal vibration applied on the fingertip while concurrently applying constant cold and warm stimuli at high and low contact pressures to the proximal phalanx. Our results revealed no significant differences in detection thresholds across conditions. These preliminary findings suggest that applying constant thermal stimuli to other skin locations does not affect fingertip vibrotactile sensitivity, possibly due to perceptual adaptation. However, the influence of dynamic multisensory tactile stimuli remains an open question for future research.

IEEE Transactions on Haptics, 2023

[paper] [code] [data]

IEEE World Haptics Conference 2021, Interactive Demonstration & Work-in-Progress Paper

Preserving Texture Realism Across Remote Actuator Placement and Variable Fingertip Velocity

Wearable haptic displays that relocate feedback away from the fingertip provide a much-needed sense of touch to interactions in virtual reality, while also leaving the fingertip free from occlusion for augmented reality tasks. However, the impact of relocation on perceptual sensitivity to dynamic changes in actuation during active movement remains unclear. In this work, we investigate the perceived realism of virtual textures rendered via vibrations relocated to the base of the index finger and compare three different methods of modulating vibrations with active finger speed. For the first two methods, changing finger speed induced proportional changes in either the frequency or the amplitude of vibration, while the third method did not modulate the vibration at all. In psychophysical experiments, participants compared different types of modulation to each other, as well as to real 3D-printed textured surfaces. Results suggest that frequency modulation results in more realistic sensations for coarser textures, whereas participants were less discerning of modulation type for finer textures. Additionally, we presented virtual textures either fully virtual in midair or under augmented reality in which the finger contacted a flat surface; while we found no difference in experimental performance, participants were divided by a strong preference for either the contact or non-contact condition.
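The speed-to-frequency modulation studied above builds on a standard texture-rendering relation: sliding over a grating of spatial period λ at speed v excites vibration at frequency f = v/λ. A minimal sketch (the 2 mm period and the speeds are illustrative values, not taken from the paper):

```python
# Toy illustration of speed-proportional frequency modulation for a virtual
# grating (values are hypothetical, not from the study).

def render_frequency(speed_mm_s, spatial_period_mm=2.0):
    """Vibration frequency in Hz for a grating of the given spatial period."""
    return speed_mm_s / spatial_period_mm

slow = render_frequency(60.0)    # 60 mm/s over a 2 mm grating -> 30.0 Hz
fast = render_frequency(240.0)   # four times the speed -> 120.0 Hz
```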

IEEE Transactions on Haptics, 2023

[paper] [code] [data]

Focused vibrotactile stimuli from a wearable sparse array of actuators

Wearable vibrotactile actuators are non-intrusive and inexpensive means to provide haptic feedback directly to the user's skin. Complex spatiotemporal stimuli can be achieved by combining several of these actuators using the funneling illusion. This illusion can funnel the sensation to a particular position between the actuators, thereby creating virtual actuators. However, using the funneling illusion to create virtual actuation points is not robust and leads to sensations that are difficult to locate. We postulate that this poor localization can be improved by accounting for the dispersion and attenuation of the wave propagation on the skin. We used the inverse filter technique to compute the delays and amplification of each frequency component to correct the distortion and create sharp sensations that are easier to detect. We developed a wearable device composed of four independently controlled actuators that stimulates the volar surface of the forearm. A psychophysical study involving twenty participants showed that the focused sensation improves localization confidence by 20% compared to the non-corrected funneling illusion. We anticipate our results will improve the control of wearable vibrotactile devices used for emotional touch or tactile communication.
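The correction can be sketched as follows. This toy model is not the study's filter: the dispersion law c(f), the attenuation constant, and the actuator distances are assumptions for demonstration only. Each channel's drive signal is pre-filtered so that, after the modeled propagation, the contributions of all actuators arrive in phase at the focal point.

```python
# Illustrative inverse filtering for dispersive, attenuating wave propagation
# on skin. The phase velocity model c(f), the attenuation constant alpha, and
# the actuator-to-focus distances are hypothetical, not measured values.
import numpy as np

fs = 2000.0                        # sample rate (Hz)
t = np.arange(0, 0.1, 1 / fs)      # 100 ms window
target = np.sin(2 * np.pi * 150 * t) * np.hanning(t.size)  # desired focal signal

def precompensate(signal, distance_m, alpha=8.0):
    """Inverse-filter one channel: advance each frequency by its propagation
    delay and boost it by the modeled attenuation."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    c = 2.0 + 0.01 * freqs         # hypothetical dispersive phase velocity (m/s)
    delay = distance_m / c         # per-frequency arrival delay
    spectrum *= np.exp(alpha * distance_m) * np.exp(2j * np.pi * freqs * delay)
    return np.fft.irfft(spectrum, signal.size)

def propagate(signal, distance_m, alpha=8.0):
    """Forward model: the same dispersion and attenuation applied by the skin."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    c = 2.0 + 0.01 * freqs
    spectrum *= np.exp(-alpha * distance_m) * np.exp(-2j * np.pi * freqs * distance_m / c)
    return np.fft.irfft(spectrum, signal.size)

# Two actuators at different distances from the focal point: after
# pre-compensation, their contributions arrive aligned and sum to the target.
arrived = sum(propagate(precompensate(target / 2, d), d) for d in (0.03, 0.05))
error = np.max(np.abs(arrived - target))
```

Because the pre-filter is the exact inverse of the modeled propagation, the summed arrivals reproduce the target signal; on real skin, the sharpness of the focus depends on how well the dispersion and attenuation models match.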

FeelPen: A Haptic Stylus Displaying Multimodal Texture Feels on Touchscreens

The ever-growing mobile market has spurred strong interest in stylus-based interactions. Most state-of-the-art styluses either provide no haptic feedback or only deliver one type of sensation, such as vibration or skin stretch. Equipping these devices with the ability to display a palette of tactile feels can pave the way for rendering realistic surface sensations, resulting in more natural virtual experiences. However, integrating the necessary actuators and sensors while keeping the compact form factor of a stylus for comfortable user interactions challenges their design. This situation also limits the scientific knowledge of relevant parameters for rendering compelling artificial textures for stylus-based interactions. To address these challenges, we developed FeelPen, a haptic stylus that can display multimodal texture properties (compliance, roughness, friction, and temperature) on touchscreens. We validated the texture rendering capability of our design by conducting system identification and psychophysical experiments. The experimental results confirmed that FeelPen can render a variety of modalities with the wide parameter ranges necessary to create perceptually salient texture feels, making it a one-of-a-kind stylus. Our unique design and experimental results pave the way for new perspectives on stylus-based interactions on future touchscreens.

ThermoSurf: Thermal display technology for dynamic and multi-finger interactions

Thermal feedback has been proven to enhance user experience in human-machine interactions. Yet state-of-the-art thermal technology has focused on a single finger or palm in static contact, overlooking dynamic and multi-finger interactions. The underlying challenges include conventional interface designs that cannot provide salient thermal stimuli for such interactions and, consequently, a lack of knowledge of human thermal perception under the relevant conditions. Here we present ThermoSurf, a new thermal display technology that can deliver temperature patterns on a large interface suitable for dynamic and multi-finger interactions. We also investigate how user exploration affects the perception of the generated temperature distributions. Twenty-three human participants interacted with the device following three exploration conditions (static-single finger, dynamic-single finger, and static-multi finger) and evaluated 15 temperature differences. Our results showed that human sensitivity to thermal stimuli is significantly greater for static-single finger contact than for the other tested conditions. In addition, this interaction type resulted in higher thermal discrimination thresholds than the ones reported in the literature.

IEEE Transactions on Haptics, 2022

[paper] [code] [data]

Learning to Feel Textures: Predicting Perceptual Similarities from Unconstrained Finger-Surface Interactions

Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies in surface perception all used general surface descriptors measured in controlled conditions instead of considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of the results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity.  
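The core idea above, comparing probability distributions of short-window interaction features, can be shown with a minimal sketch. The synthetic signals, the single RMS feature, and the histogram-intersection measure below are stand-ins for illustration, not the paper's feature set or similarity metric.

```python
# Sketch of distribution-based surface similarity (illustrative only): each
# signal is summarized by the distribution of a short-window feature, and two
# surfaces are rated similar when those distributions overlap.
import numpy as np

def window_features(signal, win=64):
    """RMS of consecutive non-overlapping windows: one simple feature."""
    n = signal.size // win
    windows = signal[: n * win].reshape(n, win)
    return np.sqrt((windows ** 2).mean(axis=1))

def feature_histogram(signal, bins):
    """Empirical probability distribution of the per-window feature."""
    hist, _ = np.histogram(window_features(signal), bins=bins)
    return hist / hist.sum()

def similarity(sig_a, sig_b, bins):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return np.minimum(feature_histogram(sig_a, bins),
                      feature_histogram(sig_b, bins)).sum()

rng = np.random.default_rng(0)
smooth_a = 0.1 * rng.standard_normal(4096)   # stand-in "smooth surface" signal
smooth_b = 0.1 * rng.standard_normal(4096)   # second recording, same surface
rough = 1.0 * rng.standard_normal(4096)      # stand-in "rough surface" signal
bins = np.linspace(0.0, 1.5, 30)
same = similarity(smooth_a, smooth_b, bins)      # high: matching distributions
different = similarity(smooth_a, rough, bins)    # low: disjoint distributions
```

In a real pipeline, richer features from the motion, force, and acceleration channels and a more principled distribution distance would replace these stand-ins.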

Contact Evolution of Dry and Hydrated Fingertips at Initial Touch 

Pressing the fingertips into surfaces causes skin deformations that enable humans to grip objects and sense their physical properties. This process involves intricate finger geometry, non-uniform tissue properties, and moisture, complicating the underlying contact mechanics. Here, we explore the initial contact evolution of dry and hydrated fingers to isolate the roles of governing physical factors. Two participants gradually pressed an index finger on a glass surface under three moisture conditions: dry, water-hydrated, and glycerin-hydrated. Gross and real contact areas were optically measured over time, revealing that glycerin hydration produced strikingly higher real contact area, while gross contact area was similar for all conditions. To elucidate the causes for this phenomenon, we investigated the combined effects of tissue elasticity, skin-surface friction, and fingerprint ridges on contact area using simulation. Our analyses show the dominant influence of elastic modulus over friction and an unusual contact phenomenon, which we call friction-induced hinging.

Finger Motion and Contact by a Second Finger Influence the Tactile Perception of Electrovibration

Electrovibration holds great potential for creating vivid and realistic haptic sensations on touchscreens. Ideally, a designer should be able to control what users feel independent of the number of fingers they use, the movements they make, and how hard they press. We proved for the first time that both the number of contacting fingers and whether each finger moves significantly affect what the user feels. Part of this change comes from the fact that relative motion greatly increases the electrical impedance between a finger and the screen. These findings help lay the groundwork for delivering consistent haptic feedback via electrovibration.

Certain ungrounded asymmetric vibrations create a unidirectional force that makes the user feel as though their fingers are being pulled in a particular direction. Although researchers have discovered this haptic feedback technique and showcased its success in a variety of applications, there is still little understanding of how different attributes of the asymmetric vibration signal affect the perceived pulling sensation. This ongoing work aims to use dynamic modeling and measurement to bridge this gap between the design of the control signals and human perception.

- Principal investigator's earlier works related to current research topics at HITLab -

Physical Variables Underlying Tactile Stickiness During Fingerpad Detachment

One may notice a relatively wide range of tactile sensations even when touching the same hard, flat surface in similar ways. Little is known about the reasons for this variability, so here we investigated how the perceptual intensity of light stickiness relates to the physical interaction between the skin and the surface. Our results show that stickiness perception mainly depends on the pre-detachment pressing duration, the time taken for the finger to detach, and the impulse in the normal direction after the normal force changes sign; finger-surface adhesion seems to build with pressing time, causing a larger normal impulse during detachment and thus a more intense stickiness sensation. 

Effective utilization of electrovibration can only be accomplished by simultaneously investigating both the physical and perceptual aspects of the finger-touchscreen interaction. Towards this goal, the present work blends the available knowledge on the electromechanical properties of the human finger and human tactile perception with the results of new psychophysical experiments and physical measurements. By following such an approach that combines theoretical and experimental information, the study proposes new methods and insights for generating realistic haptic effects, such as textures and edges, on these displays.

Both vision and touch contribute to the perception of real surfaces. Although there have been many studies on the individual contributions of each sense, it is still unclear how each modality’s information is processed and integrated. Here, we investigated the similarity of visual and haptic perceptual spaces, as well as how well they each correlate with fingertip interaction metrics. We found that real surfaces are similarly organized within the three-dimensional perceptual spaces of both modalities. For each modality, these dimensions can be represented by roughness/smoothness, hardness/softness, and friction, and they can be estimated by surface vibration power, tap spectral centroid, and kinetic friction coefficient, respectively.
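One of the metrics above, the tap spectral centroid, is easy to illustrate: it is the power-weighted mean frequency of the acceleration burst produced by a tap. The decaying-sinusoid "taps" below are synthetic stand-ins, not measured data.

```python
# Power-weighted spectral centroid of a synthetic tap transient (illustrative
# signals; harder surfaces are modeled here as ringing at a higher frequency).
import numpy as np

def spectral_centroid(signal, fs):
    """Power-weighted mean frequency of a signal, in Hz."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    return (freqs * power).sum() / power.sum()

fs = 10000.0
t = np.arange(0, 0.02, 1 / fs)
hard_tap = np.sin(2 * np.pi * 1200 * t) * np.exp(-300 * t)  # high-pitched ring
soft_tap = np.sin(2 * np.pi * 200 * t) * np.exp(-300 * t)   # low-pitched ring
hard_c = spectral_centroid(hard_tap, fs)
soft_c = spectral_centroid(soft_tap, fs)   # hard_c comes out well above soft_c
```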

A Novel Texture Rendering Approach for Electrostatic Displays

Generating realistic texture feelings on tactile displays using data-driven methods has attracted a lot of interest in the last decade. However, the need for large data storage and transmission rates complicates the use of these methods for future commercial displays. Here, we propose a new texture rendering approach that can compress the texture data significantly for electrostatic displays. Using three sample surfaces, we first explain how to record, analyze, and compress the texture data, and render them on a touchscreen. Then, through psychophysical experiments conducted with nineteen participants, we show that the textures can be reproduced with significantly fewer frequency components than in the original signal without inducing perceptual degradation.
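The compression principle, keeping only a few dominant frequency components, can be sketched in a few lines. This is an illustration of the general idea, not the paper's recording or selection procedure; the "texture" signal and the number of retained components are arbitrary.

```python
# Illustrative spectral compression of a texture signal: keep only the
# strongest DFT components and resynthesize from them (toy data, not the
# paper's method or recordings).
import numpy as np

def compress_texture(signal, n_components):
    """Reconstruction that keeps only the n strongest rfft components."""
    spectrum = np.fft.rfft(signal)
    keep = np.argsort(np.abs(spectrum))[-n_components:]   # strongest bins
    reduced = np.zeros_like(spectrum)
    reduced[keep] = spectrum[keep]
    return np.fft.irfft(reduced, signal.size)

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
# toy "texture" built from a few dominant tones plus weak broadband noise
rng = np.random.default_rng(1)
texture = (np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 90 * t)
           + 0.05 * rng.standard_normal(t.size))
reconstructed = compress_texture(texture, n_components=8)
# most of the signal energy survives with only 8 of 501 spectral bins
energy_kept = np.sum(reconstructed ** 2) / np.sum(texture ** 2)
```

Storing only the retained bin indices, magnitudes, and phases in place of the raw signal is what yields the compression.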

Effect of Remote Masking on Detection of Electrovibration

Here, we investigated whether it is possible to change the detection threshold of electrovibration at the fingertip of the index finger via remote masking, i.e., by applying a (mechanical) vibrotactile stimulus on the proximal phalanx of the same finger. We found that the vibrotactile masking stimuli generated sub-threshold vibrations around the fingertip and hence did not mechanically interfere with the electrovibration stimulus. However, there was a clear psychophysical masking effect due to central neural processes. The electrovibration absolute threshold increased by approximately 0.19 dB for each dB increase in the masking level.
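As a worked reading of that growth rate (a simplification for illustration, assuming the ~0.19 dB/dB slope holds linearly over the masking range):

```python
# Linear-in-dB masking model built from the reported ~0.19 dB/dB growth rate.
# The baseline threshold and masker level below are illustrative values.

def masked_threshold_db(unmasked_db, masking_level_db, slope=0.19):
    """Electrovibration detection threshold under a remote vibrotactile masker."""
    return unmasked_db + slope * masking_level_db

# A masker 20 dB above its reference level raises the electrovibration
# threshold by about 3.8 dB in this model.
shift = masked_threshold_db(0.0, 20.0) - masked_threshold_db(0.0, 0.0)
```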

Realistic display of tactile textures on touch screens would be a major step toward bringing haptic technology to the wide range of consumers who use electronic devices daily. Since the texture topography cannot be rendered explicitly by electrovibration on touch screens, it is important to understand how we perceive the virtual textures displayed by friction modulation via electrovibration. Here, we investigated the roughness perception of real gratings made of plexiglass and virtual gratings displayed by electrovibration through a touch screen for comparison. The results showed that the roughness perception of real and virtual gratings is different. We argue that this difference can be explained by the amount of fingerpad penetration into the gratings. For real gratings, penetration increased the tangential forces acting on the finger, whereas for virtual ones, where skin penetration is absent, tangential forces decreased with spatial period.

Tactile Masking by Electrovibration

Future touch screen applications will include multiple tactile stimuli displayed simultaneously or consecutively to a single finger or multiple fingers. These applications should be designed with human tactile masking mechanisms in mind, since it is known that presenting one stimulus may interfere with the perception of another. Here, we investigate the effect of masking on the tactile perception of electrovibration displayed on touch screens. Moreover, to investigate the effect of tactile masking on our haptic perception of edge sharpness, we compared the perceived sharpness of edges separating two textured regions displayed with and without various masking stimuli. Our results suggest that sharpness perception depends on the local contrast between background and foreground stimuli, which varies as a function of masking amplitude and activation levels of frequency-dependent psychophysical channels.

Effect of Waveform on Tactile Perception by Electrovibration Displayed on Touchscreens

Here, we investigated the effect of input voltage waveform on our haptic perception of electrovibration on touch screens. We found that the subjects were more sensitive to stimuli generated by a square-wave voltage than a sinusoidal one for frequencies lower than 60 Hz. Using MATLAB simulations, we showed that this perceptual difference between waveforms at low fundamental frequencies arises from the frequency-dependent electrical properties of human skin and human tactile sensitivity. To validate our simulations, we actuated the touch screen at the threshold voltages and then measured the contact force and acceleration acting on the index fingers of the subjects as they moved on the screen at a constant speed. We analyzed the collected data in the frequency domain using the human vibrotactile sensitivity curve. The results suggested that the Pacinian channel was the primary psychophysical channel in the detection of the electrovibration stimuli caused by all the square-wave inputs tested in this study.
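The frequency-domain reasoning can be sketched with a toy model. The U-shaped sensitivity curve below is a hypothetical Pacinian-like weighting, not the study's measured curve, and the skin's electrical filtering is omitted; the sketch only shows why a low-frequency square wave, whose odd harmonics fall in the sensitive region, can be easier to detect than a sine at the same fundamental.

```python
# Toy sensitivity-weighted comparison of square-wave and sine stimuli
# (hypothetical sensitivity curve; illustrative only).
import numpy as np

def harmonic_amplitudes(fundamental_hz, n_harmonics=9):
    """Odd-harmonic frequencies and amplitudes of a unit square wave."""
    ks = np.arange(1, n_harmonics * 2, 2)          # 1, 3, 5, ...
    return ks * fundamental_hz, 4 / (np.pi * ks)   # Fourier-series amplitudes

def sensitivity(freq_hz):
    """Toy U-shaped sensitivity peaking near 250 Hz (Pacinian-like)."""
    return np.exp(-((np.log10(freq_hz) - np.log10(250.0)) ** 2) / 0.18)

def effective_level(fundamental_hz, square=True):
    """Sensitivity-weighted RMS level of the stimulus spectrum."""
    if square:
        freqs, amps = harmonic_amplitudes(fundamental_hz)
    else:
        freqs, amps = np.array([fundamental_hz]), np.array([1.0])
    return np.sqrt(((amps * sensitivity(freqs)) ** 2).sum())

# At a 30 Hz fundamental the square wave's harmonics reach the sensitive
# region, so its effective level exceeds the sine's at the same drive amplitude.
square_30 = effective_level(30.0, square=True)
sine_30 = effective_level(30.0, square=False)
```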