Haptic rendering of textures

Tutorial

This half-day tutorial, held on Sunday afternoon, February 23, 2014, will provide an overview of the problem of haptic texture rendering and then carefully explain a new set of methods the presenters have developed for creating highly realistic haptic virtual textures. While some of the discussion will be relevant to bare-finger haptic interactions, we will focus on situations where the user touches the surface through a rigid tool, as shown in the image below. Interestingly, even though the skin is not in contact with the surface, humans can perceive many properties of a texture by dragging a rigid tool across it. Such interactions frequently arise in art, design, manufacturing, and medicine, as well as in everyday tasks such as writing a grocery list.

Tool-texture interaction

Background: Real surfaces include small-scale features that interact with the tip of the tool through impacts and frictional contact. Rendering such textures has the potential to increase the realism of haptic virtual environments, but texture models must be implemented in a way that respects the software and hardware capabilities of haptic interface systems. Researchers have thus developed several different approaches to haptic texture rendering, which we will summarize.

Modeling: Our research in this area has been motivated by a desire to create virtual textures that perfectly mimic the feel of specific real textures. We have thus adopted a data-driven approach to haptic texture modeling, which we will thoroughly explain and demonstrate during the tutorial. As shown in the image above, we use an instrumented handheld tool to record mechanical measurements (tool position and orientation, contact force, and high-frequency tool acceleration) during interactions with the surface of interest. The recorded signals are analyzed to generate a compact model of the high-frequency tool vibrations that occur at different combinations of tool speed and normal force, which we believe capture the essence of how the texture feels.
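To make this modeling pipeline concrete, the sketch below shows one plausible implementation in Python: the recorded acceleration signal is cut into short windows, each window is tagged with its mean tool speed and normal force, and a low-order autoregressive (AR) model is fit to each window via the Yule-Walker equations. The function names, window length, model order, and sample rate are illustrative assumptions, not the presenters' actual code.

    import numpy as np
    from scipy.linalg import solve_toeplitz

    def fit_ar(window, order=10):
        # Fit an autoregressive model to one acceleration window via the
        # Yule-Walker equations (an illustrative choice of estimator).
        w = window - np.mean(window)
        n = len(w)
        # Biased autocorrelation estimates r[0..order]
        r = np.array([np.dot(w[:n - k], w[k:]) for k in range(order + 1)]) / n
        coeffs = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])
        variance = r[0] - np.dot(coeffs, r[1:order + 1])  # driving-noise power
        return coeffs, variance

    def fit_texture_model(accel, speed, force, fs=10000, win_s=0.1, order=10):
        # Segment one recording into windows and fit an AR model per
        # (mean speed, mean force) operating point.  The 10 kHz rate,
        # 100 ms window, and order 10 are assumptions for illustration.
        n = int(win_s * fs)
        models = []
        for start in range(0, len(accel) - n + 1, n):
            seg = slice(start, start + n)
            coeffs, var = fit_ar(np.asarray(accel[seg]), order)
            models.append({
                "speed": float(np.mean(speed[seg])),   # e.g., mm/s
                "force": float(np.mean(force[seg])),   # e.g., N
                "ar_coeffs": coeffs,
                "variance": var,
            })
        return models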

Rendering: The tutorial will also carefully explain our associated haptic texture rendering process, with demonstrations on a modified Wacom tablet (shown in the image above) and/or a SensAble Phantom Omni. As the user touches the virtual surface, the system uses real-time measurements of the tool speed and normal force to calculate a high-frequency vibration signal very similar to what would be felt if the tool were moved in the same way across the real surface. This waveform can be displayed to the user via a voice-coil actuator attached to the tool or via the motors of a kinesthetic haptic interface. The resulting haptic virtual textures feel very similar to the real surfaces from which they were modeled.
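As a rough illustration of this synthesis step, the sketch below (continuing the Python example above) produces one vibration sample per haptic update by selecting the stored AR model whose (speed, force) operating point is closest to the current measurements and filtering scaled white noise through it. The nearest-neighbor lookup is a simplifying assumption, standing in for the smoother model interpolation a full renderer would use.

    import numpy as np

    class TextureRenderer:
        # Minimal vibration synthesizer: keeps a short history of past
        # output samples and, at each update, excites the AR model nearest
        # to the measured (speed, force) pair with white noise.
        def __init__(self, models, order=10):
            self.models = models              # as built by fit_texture_model
            self.history = np.zeros(order)    # most recent sample first

        def next_sample(self, speed, force):
            # Nearest-neighbor model lookup (simplifying assumption)
            mod = min(self.models,
                      key=lambda m: (m["speed"] - speed) ** 2
                                  + (m["force"] - force) ** 2)
            # AR synthesis: weighted past outputs plus noise excitation
            noise = np.random.randn() * np.sqrt(max(mod["variance"], 0.0))
            sample = np.dot(mod["ar_coeffs"], self.history) + noise
            self.history = np.roll(self.history, 1)
            self.history[0] = sample
            return sample

In use, next_sample would be called once per haptic servo cycle and its output scaled and commanded to the voice-coil actuator or device motors; a practical renderer would also fade the vibration to zero when the tool leaves the surface.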

Toolkit: Finally, this tutorial will introduce the Penn Haptic Texture Toolkit (HaTT), a publicly available repository for use by the research community. HaTT includes one hundred haptic virtual texture models created using the above methods, along with the original data from which the models were made, photographs of the surfaces, and code for rendering the textures on a SensAble Phantom Omni. We hope this repository will enable other researchers to use, test, and improve on the approaches we have developed for modeling and rendering haptic textures.

Katherine J. Kuchenbecker and Heather Culbertson

We enjoyed giving this tutorial at Haptics Symposium 2014. We have posted PDFs of both the agenda and the slides shown during the tutorial. If you have any questions, please contact us at kuchenbe [at] seas [dot] upenn [dot] edu and hculb [at] seas [dot] upenn [dot] edu.