Multimodal Haptic Interfaces

Current haptic interface technologies offer limited tactile rendering capabilities and often fail to integrate other sensory information. Our research focuses on developing innovative haptic interfaces that blend multimodal tactile cues with visuo-auditory information to create more realistic user experiences.

Related Publications

arXiv, 2024

[preprint]

Tactile Weight Rendering: A Review for Researchers and Developers

Haptic rendering of weight plays an essential role in naturalistic object interaction in virtual environments. While kinesthetic devices have traditionally been used for this purpose by applying forces on the limbs, tactile interfaces acting on the skin have recently emerged as potential solutions to enhance or substitute for kinesthetic ones. Here, we aim to provide an in-depth overview and comparison of existing tactile weight rendering approaches. We categorized these approaches by type of stimulation into asymmetric vibration and skin stretch, and further divided them according to the working mechanism of the devices. We then compared these approaches using various criteria, including the physical, mechanical, and perceptual characteristics of the reported devices and their potential applications. We found that asymmetric vibration devices have the smallest form factor, while skin stretch devices relying on the motion of flat surfaces, belts, or tactors present numerous mechanical and perceptual advantages for scenarios requiring more accurate weight rendering. Finally, we discussed how to select devices within the proposed categorization and their application scopes, together with the limitations and opportunities for future research. We hope this study guides the development and use of tactile interfaces to achieve more naturalistic object interaction and manipulation in virtual environments.
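
As background on the asymmetric-vibration category mentioned in the review above (an illustration of the general principle, not a detail taken from the paper), such devices typically drive an inertial mass with a skewed displacement profile: a brief fast stroke in one direction followed by a slow return, so skin shear is stronger in one direction and a net pull is perceived. A minimal sketch of such a waveform, with purely hypothetical parameter values:

```python
import numpy as np

def asymmetric_waveform(f=40.0, skew=0.25, fs=10_000, cycles=50):
    """Sawtooth-like displacement profile repeated at frequency f [Hz]:
    a fast stroke over the first `skew` fraction of each cycle, then a
    slow return. The asymmetry in stroke speed is what asymmetric-
    vibration devices exploit to bias the perceived force direction
    (e.g., toward a 'heavier' downward pull)."""
    t = np.arange(int(fs * cycles / f)) / fs           # time vector [s]
    phase = (t * f) % 1.0                               # position within cycle [0, 1)
    rising = phase < skew
    x = np.where(rising,
                 phase / skew,                          # quick ramp up
                 1.0 - (phase - skew) / (1.0 - skew))   # slow ramp down
    return t, x
```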

Accepted to BioRob 2024

[preprint]

Design and evaluation of a multi-finger skin-stretch tactile interface for hand rehabilitation robots

Object properties perceived through the tactile sense, such as weight, friction, and slip, greatly influence motor control during manipulation tasks. However, the provision of tactile information during robotic training in neurorehabilitation has not been well explored. Therefore, we designed and evaluated a tactile interface based on a two-degree-of-freedom moving platform mounted on a hand rehabilitation robot that provides skin stretch at four fingertips, from the index through the little finger. To accurately control the rendered forces, we included a custom magnetic-based force sensor to control the tactile interface in a closed loop. The technical evaluation showed that our custom force sensor achieved a measurable shear force range of 8 N with accuracies of 95.2–98.4%, influenced by hysteresis, viscoelastic creep, and torsional deformation. The tactile interface accurately rendered forces with a step-response steady-state accuracy of 97.5–99.4% and a frequency response covering the range of most activities of daily living. Our sensor showed the highest measurement range-to-size ratio and accuracy comparable to sensors of its kind. These characteristics enabled closed-loop force control of the tactile interface for precise rendering of multi-finger, two-dimensional skin stretch. The proposed system is a first step towards richer and more realistic haptic feedback during robotic sensorimotor rehabilitation, potentially improving therapy outcomes.
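
The abstract reports closed-loop force control via the custom magnetic force sensor but does not describe the controller itself. Purely as a minimal sketch, the loop below assumes a simple PI controller, with `read_shear_force` and `command_platform` as hypothetical stand-ins for the device's sensor and actuator interfaces:

```python
def render_skin_stretch(read_shear_force, command_platform,
                        target_force_n, kp=0.8, ki=5.0,
                        dt=0.001, duration_s=2.0):
    """Minimal PI force loop: adjust the moving-platform command until
    the measured fingertip shear force matches the target. Gains,
    sample time, and interface functions are illustrative placeholders,
    not values from the paper."""
    integral = 0.0
    for _ in range(int(duration_s / dt)):
        error = target_force_n - read_shear_force()    # force error [N]
        integral += error * dt
        command_platform(kp * error + ki * integral)   # actuator command
```

In practice such a loop would run on the robot's real-time controller and include saturation and anti-windup handling, omitted here for brevity.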

Accepted to BioRob 2024

[preprint]

Relocating thermal stimuli to the proximal phalanx may not affect vibrotactile sensitivity on the fingertip

Wearable devices that relocate tactile feedback away from the fingertips can enable users to interact with their physical world augmented by virtual effects. While studies have shown that relocating same-modality tactile stimuli can influence the stimuli perceived at the fingertip, how cross-modal tactile stimuli interact remains unclear. Here, we investigate how thermal cues applied to the index finger's proximal phalanx affect vibrotactile sensitivity at the fingertip of the same finger when delivered at varying contact pressures. We designed a novel wearable device that can deliver thermal stimuli at adjustable contact pressures on the proximal phalanx. Using this device, we measured the detection thresholds of fifteen participants for a 250 Hz sinusoidal vibration applied to the fingertip while concurrently applying constant cold and warm stimuli at high and low contact pressures to the proximal phalanx. Our results revealed no significant differences in detection thresholds across conditions. These preliminary findings suggest that applying constant thermal stimuli to other skin locations does not affect fingertip vibrotactile sensitivity, possibly due to perceptual adaptation. However, the influence of dynamic multisensory tactile stimuli remains an open question for future research.
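
The abstract does not state which psychophysical procedure was used to estimate the detection thresholds. Purely as an illustration of how such thresholds are commonly measured, the sketch below implements a generic 2-down/1-up adaptive staircase; the `respond` callback, step size, and stopping rule are assumptions, not details from the study:

```python
import random

def staircase_threshold(respond=None, start_level=-10.0, step=2.0,
                        n_reversals=8):
    """Generic 2-down/1-up staircase converging near 70.7% detection.
    `respond(level)` should return True if the participant detected the
    250 Hz burst at that amplitude (dB re an arbitrary reference); a
    random stub is used here so the sketch runs stand-alone."""
    level, streak, direction, reversals = start_level, 0, -1, []
    while len(reversals) < n_reversals:
        detected = respond(level) if respond else random.random() < 0.7
        if detected:
            streak += 1
            if streak == 2:                  # two correct in a row -> harder
                streak = 0
                if direction != -1:
                    reversals.append(level)  # direction change = reversal
                direction, level = -1, level - step
        else:                                # miss -> easier
            streak = 0
            if direction != +1:
                reversals.append(level)
            direction, level = +1, level + step
    return sum(reversals) / len(reversals)   # threshold estimate
```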

FeelPen: A Haptic Stylus Displaying Multimodal Texture Feels on Touchscreens

The ever-growing mobile market has spurred interest in stylus-based interactions. Most state-of-the-art styluses either provide no haptic feedback or deliver only one type of sensation, such as vibration or skin stretch. Equipping these devices with the ability to display a palette of tactile feels can pave the way for rendering realistic surface sensations, resulting in more natural virtual experiences. However, integrating the necessary actuators and sensors while keeping the compact form factor of a stylus needed for comfortable user interaction makes their design challenging. This also limits scientific knowledge of the parameters relevant to rendering compelling artificial textures for stylus-based interactions. To address these challenges, we developed FeelPen, a haptic stylus that can display multimodal texture properties (compliance, roughness, friction, and temperature) on touchscreens. We validated the texture rendering capability of our design by conducting system identification and psychophysical experiments. The experimental results confirmed that FeelPen could render a variety of modalities with the wide parameter ranges necessary to create perceptually salient texture feels, making it a one-of-a-kind stylus. Our unique design and experimental results pave the way for new perspectives for stylus-based interactions on future touchscreens.

ThermoSurf: Thermal display technology for dynamic and multi-finger interactions

Thermal feedback has been proven to enhance user experience in human-machine interactions. Yet state-of-the-art thermal technologies have focused on a single finger or palm in static contact, overlooking dynamic and multi-finger interactions. The underlying challenges include conventional interface designs that are ill-suited to providing salient thermal stimuli during such interactions and, consequently, a lack of knowledge of human thermal perception under the relevant conditions. Here we present ThermoSurf, a new thermal display technology that can deliver temperature patterns on a large interface suitable for dynamic and multi-finger interactions. We also investigate how user exploration affects the perception of the generated temperature distributions. Twenty-three human participants interacted with the device under three exploration conditions (static single-finger, dynamic single-finger, and static multi-finger) and evaluated 15 temperature differences. Our results showed that human sensitivity to thermal stimuli is significantly greater for static single-finger contact than for the other tested conditions. In addition, this interaction type resulted in higher thermal discrimination thresholds than those reported in the literature.