Research Collection – Hands-on research and prototyping for haptics

This post has been republished via RSS; it originally appeared at: Microsoft Research.

While many of us think of human-computer interaction as a job for the eyes, ears and mind, we don’t think as often about the importance and complexity of our tactile interactions with computers. Haptics – the sense of touch or tactile sensations – permeates our computing experience, though we are often so habituated to it that it goes unnoticed. Consider the feeling of resistance and key travel that moderates your typing speed and confirms that your key presses are working, a palpable “click” on the mouse or touchpad that confirms an action, the patterns of vibration in your pocket or on your wrist to signal a notification or incoming call, or the subtle vibration you experience on some mobile phones when typing on a touchscreen keyboard.

These forms of feedback give us confirmation of our interactions with computers, but haptics also provide “feed-forward” to orient us to computer interfaces: the ridges that help your fingers find your keyboard’s home row without looking, the shapes and orientations of the buttons on your gaming controllers, or the camera “bulge” or physical buttons that help you know which way is “up” on your mobile devices when you fumble for them in the dark.

The addition of haptics also blurs the distinction between input and output devices – the most familiar example of this is the touch screen, which allows us to more directly “grasp” and manipulate on-screen objects. The rapid proliferation of touch screens also brings to light the extent to which we quickly acclimate to new tactile interfaces – note how some young children, having used tablets from a very early age, assume that any screen (or even a printed magazine) has touch capabilities, and are disappointed or frustrated when it doesn’t.

One of the earliest explorations of haptics in human-computer interaction was a rotary knob implemented in 1973 that functioned both as an input and output device: it could change its physical characteristics depending on what it was controlling at the time. For instance, it could vary its resistance from free-wheeling to locked, or “snap” back to an initial position when turned. This device was introduced alongside one of the first implementations of a capacitive touch screen, as a proposed means of controlling the CERN particle accelerator.

While haptics in computing could someday encompass the entirety of tactile sensations, particularly in virtual reality – from vibrations and textures you feel in your feet, to sensations of heat, cold or a light breeze on your arm – a great deal of research so far has focused on the hands. Our hands are the tactile interface for most of the computing technology we use (as well as for the physical tools we have used for thousands of years), and their importance to our survival is reflected in our physiology: 54 of the human body’s 206 bones are in our hands.

In virtual environments, further developments in haptics will enable a more realistic physical experience – while the illusion created by current VR experiences “breaks” when the user interacts with a virtual object that has only visual or auditory signals of its apparent weight, shape or texture, future haptic devices could simulate the heft of a rock in a virtual forest and the sense of momentum when it is thrown or caught, the roughness of a stone wall, the roundness and pliability of a rubber ball, or the recoil of a weapon fired in a game.

Microsoft researchers have been exploring haptics for many years, and their work has generally covered three overlapping areas: the psychology of perception (and the illusions we can apply to manipulate it), the development of prototype haptic input and output devices, and user studies on the effectiveness of haptic techniques. This research could be applied to compensate for or augment other senses (such as through the Canetroller for people with visual impairments), or to create a greater sense of immersion and agency in VR and AR environments.


This research collection brings together a number of prototype devices and associated research at Microsoft that relate to haptic interfaces for the hands. Our researchers have focused on two main aspects of theory and practice: building prototypes, and using those prototypes to understand the limits and interactions of haptics in devices. In doing so, they have found that haptics can be rendered asymmetrically (for example, varying the simulated forces for grabbing versus releasing an object) and that simulations of touch can create an “uncanny valley” of haptics.

In practice, the prototyping has focused on three basic interactions:

  • Palpation: The texture and responsiveness of a surface
  • Manipulation: The geometry of an object and its material qualities (such as rigidity)
  • Kinetics: The energy of an object and its momentum when thrown or caught


NormalTouch / TextureTouch

NormalTouch and TextureTouch are mechanically actuated handheld controllers that render the shape of virtual objects through physical shape displacement, enabling users to feel 3D surfaces, textures and forces that match the virtual rendering. Both controllers are tracked with six degrees of freedom, and produce spatially registered haptic feedback to the user’s finger. NormalTouch haptically renders object surfaces and provides force feedback using a tiltable and extrudable platform. TextureTouch renders the shape of virtual objects, including detailed surface structure, through a 4×4 matrix of actuated pins. By moving the controllers around in space while keeping a finger on the actuated platform, users obtain the impression of a much larger 3D shape by cognitively integrating these sensations over time. Our evaluation compares the effectiveness of our controllers with the two de facto standards for VR controllers: device vibration and visual feedback alone. We find that haptic feedback significantly increases the accuracy of VR interaction, most effectively by rendering high-fidelity shape output as in the case of our controllers. Participants also generally found NormalTouch and TextureTouch realistic in conveying the sense of touch for a variety of 3D objects.
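To make the pin-matrix idea concrete, here is a minimal sketch of how a 4×4 actuated-pin display like TextureTouch’s could be driven from a virtual surface. The function name, units, pad size and normalisation scheme are illustrative assumptions, not the device’s actual software.

```python
def sample_pin_heights(heightmap, finger_x, finger_y, pad_size=0.01, grid=4):
    """Sample a virtual surface under the tracked fingertip into a
    grid x grid array of normalised pin extensions (0..1).

    heightmap(x, y) -> virtual surface height in metres at (x, y)
    finger_x/y      -> tracked fingertip position in metres
    pad_size        -> assumed physical width of the pin pad in metres
    """
    half = pad_size / 2.0
    step = pad_size / (grid - 1)
    # Heights sampled on a grid centred under the fingertip.
    heights = [[heightmap(finger_x - half + i * step,
                          finger_y - half + j * step)
                for i in range(grid)] for j in range(grid)]
    lo = min(min(row) for row in heights)
    hi = max(max(row) for row in heights)
    span = hi - lo
    # Normalise into the pins' travel range; a flat surface leaves all pins down.
    return [[(h - lo) / span if span else 0.0 for h in row] for row in heights]
```

As the paragraph above notes, the display is small; the impression of a larger shape comes from re-sampling the surface as the tracked controller moves.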


Haptic Revolver

The Haptic Revolver (Haptic Wheel) is a hand-held VR controller that renders fingertip haptics when interacting with virtual surfaces. Its core element is an actuated wheel that raises and lowers underneath the finger to render contact with a virtual surface. As the user’s finger moves along the surface of an object, the controller spins the wheel to render shear forces and motion under the fingertip. The wheel is interchangeable and can contain physical textures, shapes, edges or active elements. Because the controller is spatially tracked, these physical features can be spatially registered with the geometry of the virtual environment and rendered on demand.
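The shear-rendering idea above reduces to matching the wheel’s surface speed to the finger’s motion relative to the virtual surface. This is a one-axis sketch under assumed parameter names and a made-up wheel radius, not the Haptic Revolver’s actual control code.

```python
def wheel_velocity(finger_velocity, surface_velocity=0.0, wheel_radius=0.01):
    """Angular velocity (rad/s) for the wheel so that its contact point
    recreates the shear the finger would feel on a real surface.

    finger_velocity  -> finger speed along the surface (m/s, signed)
    surface_velocity -> speed of a moving virtual surface (e.g. a conveyor)
    wheel_radius     -> assumed wheel radius in metres
    """
    # Relative slip between fingertip and virtual surface.
    slip = finger_velocity - surface_velocity
    # The wheel's surface moves opposite to the slip, so a finger sliding
    # over a stationary virtual surface feels motion under the fingertip.
    return -slip / wheel_radius
```

A stationary finger on a stationary surface yields zero spin; a moving virtual surface under a still finger spins the wheel on its own.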


TORC

TORC is a rigid haptic controller that renders virtual object characteristics such as texture and compliance using the dexterity of the user’s fingers. Users hold and squeeze TORC using their thumb and two fingers, and interact with virtual objects by sliding their thumb on TORC’s trackpad. Vibrotactile motors render sensations to each finger that represent the feel of squeezing, shearing or turning an object. Evaluation of the device showed that users could manipulate virtual objects more precisely with TORC than with conventional VR controllers.
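A common vibrotactile scheme for rendering texture under a sliding thumb is to raise the vibration frequency with slide speed, as if the finger were crossing the texture’s ridges. The sketch below illustrates that general scheme; the function, parameters and texture model are assumptions, not TORC’s actual algorithm.

```python
import math

def texture_vibration(slide_speed, spatial_period=0.002, t=0.0, amplitude=1.0):
    """Drive signal for a vibrotactile motor rendering a grating-like texture.

    slide_speed    -> thumb speed over the virtual surface (m/s)
    spatial_period -> assumed distance between texture ridges (m)
    t              -> time in seconds
    """
    # Ridge-crossing rate: faster slides over finer textures buzz at
    # higher frequency (f = speed / spatial period, in Hz).
    freq = slide_speed / spatial_period
    return amplitude * math.sin(2 * math.pi * freq * t)
```

At 0.05 m/s over 2 mm ridges this produces a 25 Hz vibration; a stationary thumb produces none.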



CLAW

CLAW is a handheld virtual reality controller that augments typical controller functionality with force feedback and actuated movement to the index finger. The controller enables three distinct interactions (grasping a virtual object, touching virtual surfaces, and triggering) and changes its corresponding haptic rendering by sensing differences in the user’s grasp. A servo motor coupled with a force sensor renders controllable forces to the index finger during grasping and touching. Using position tracking, a voice coil actuator at the index fingertip generates vibrations for various textures synchronized with finger movement. CLAW also supports haptic force feedback in trigger mode, such as when the user fires a virtual gun. We describe the design considerations for CLAW and evaluate its performance through two user studies. The first study obtained qualitative user feedback on the naturalness, effectiveness, and comfort when using the device. The second study investigated the ease of the transition between grasping and touching when using our device.



CapstanCrunch

CapstanCrunch is a force-resisting, palm-grounded haptic controller that renders haptic feedback for touching and grasping both rigid and compliant objects in a VR environment. In contrast to previous controllers, CapstanCrunch renders human-scale forces without large, high-force, power-hungry and expensive actuators. Instead, it integrates a friction-based capstan-plus-cord variable-resistance brake mechanism that is dynamically controlled by a small internal motor. The capstan mechanism magnifies the motor’s force by a factor of around 40 as an output resistive force. Compared to active force-control devices, it is low-cost, low-power, robust, safe, fast and quiet, while providing high-fidelity force control during user interaction. We describe the design and implementation of CapstanCrunch and demonstrate its use in a series of VR scenarios. Finally, we evaluate its performance in two user studies and compare it with an active haptic controller, showing that it can convincingly simulate different levels of object rigidity and compliance.
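The roughly 40× force magnification quoted above is what the classical capstan equation predicts: the load a small holding force can resist grows exponentially with friction coefficient times wrap angle. The coefficients below are illustrative assumptions, not CapstanCrunch’s measured parameters.

```python
import math

def capstan_gain(mu, wrap_angle):
    """Force amplification of a capstan brake (capstan equation):
    T_load / T_hold = e^(mu * theta).

    mu         -> friction coefficient between cord and capstan
    wrap_angle -> total wrap angle theta in radians
    """
    return math.exp(mu * wrap_angle)

# A ~40x gain needs mu * theta = ln(40) ~= 3.7 -- for example,
# mu = 0.3 with about two full wraps of cord around the capstan:
gain = capstan_gain(0.3, 2 * 2 * math.pi)  # roughly 43x
```

The exponential relationship is why a small, quiet motor modulating the hold-side tension can resist human-scale grasp forces on the load side.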


PIVOT

PIVOT is a wrist-worn haptic device that renders virtual objects into the user’s hand. Its simple design comprises a single actuated joint that pivots a haptic handle into and out of the user’s hand, rendering the haptic sensations of grasping, catching, or throwing an object – anywhere in space. Unlike existing hand-held haptic devices and haptic gloves, PIVOT leaves the user’s palm free when not in use, allowing users to make unencumbered use of their hand. PIVOT also enables rendering forces acting on the held virtual objects, such as gravity, inertia, or air drag, by actively driving its motor while the user is firmly holding the handle. When a PIVOT device is worn on both hands, the pair can add haptic feedback to bimanual interactions, such as lifting larger objects.
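The gravity, inertia and air-drag forces mentioned above can be summed into a single target force for the handle motor to approximate. This is a deliberately one-dimensional sketch under assumed parameter names and a simple quadratic drag model, not PIVOT’s actual controller.

```python
def handle_force(mass, hand_accel, velocity, drag_coeff=0.05, g=9.81):
    """Net force (N, along one axis) a held virtual object would exert
    on the hand, for the wrist-pivot motor to render against the grip.

    mass       -> virtual object mass in kg
    hand_accel -> tracked hand acceleration (m/s^2)
    velocity   -> tracked hand velocity (m/s)
    drag_coeff -> assumed lumped drag coefficient
    """
    weight = mass * g                  # gravity pulling the object down
    inertia = mass * hand_accel        # felt when the hand accelerates
    drag = drag_coeff * velocity ** 2  # simple quadratic air drag
    return weight + inertia + drag
```

Holding a 1 kg virtual object at rest, for instance, reduces to rendering just its ~9.8 N weight.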



GamesBond

GamesBond is a pair of controllers, each with four degrees of freedom, that create the illusion of being connected as a single device, despite having no physical linkage. The two controllers work together by dynamically displaying and physically rendering deformations of hand grips, allowing users to perceive a single connected object between the hands, such as a jump rope.


