Bringing virtual reality to people who are blind with an immersive sensory-based system

This post has been republished via RSS; it originally appeared at: Microsoft Research.

A man uses a VR white cane controller in an empty parking garage. Two small images in the upper right show a rendered overhead view of a room and a virtual white cane pointing at a yellow cube-shaped virtual object.

Virtual reality (VR) is an incredibly exciting way to experience computing, providing users with intuitive, immersive means of interacting with information that mirror the way we naturally experience the world around us. In the past few years, powerful VR systems have dropped in price and are on the verge of becoming mainstream technologies with potential uses in all kinds of applications. However, most VR technologies focus on rendering realistic visual effects. In fact, the hallmark of most VR systems is the head-mounted display that completely dominates a user’s visual field. But what happens if a VR user is blind? Does that mean they are completely shut out of virtual experiences?

In this project, published in the paper “Virtual Reality Without Vision: A Haptic and Auditory White Cane to Navigate Complex Virtual Worlds,” we investigated a new controller that mimics the experience of using a white cane, enabling a user who is blind to explore large virtual environments the same way they navigate the real world—by using their senses of touch and hearing. Our paper has been accepted at the ACM CHI Conference on Human Factors in Computing Systems (CHI 2020) and received an Honorable Mention Award.

Making a white cane for the virtual world with users’ needs in mind

White cane users undergo intensive training to effectively use their canes to navigate and explore the world. They learn to hold the cane differently for different situations, listen to the sounds generated as the cane taps or sweeps along the ground and obstacles, and feel subtle changes in vibrations as the cane encounters different materials. By combining this experience with their other senses (sound, smell, and touch), they can use their cane to effectively navigate their environment.

In 2018, we introduced the concept of a haptic white cane controller for VR in a paper that demonstrated how users who are blind could apply their skills with a white cane to explore a small virtual space. This year, we have expanded on this work to make the controller more natural, allowing immersive navigation of large, complex environments comprising multiple rooms. The controller is mounted to a harness that users wear around their waist, and they can then hold the controller as they would an ordinary white cane. This allows them to apply the orientation and mobility skills they’ve learned in the real world to navigate a virtual one, using the virtual cane to detect walls, doors, obstacles, and changes in surface textures. See Figure 1 below for a detailed breakdown of the controller’s components.

A woman wearing the white cane VR controller. It comprises headphones, a support harness, and the controller with the various sensors, brakes, and actuators. An inset illustrates three different styles of grips supported by the controller.

Figure 1: Left: Components of our navigation cane controller. The controller renders force feedback in three orthogonal axes of motion, tactile feedback through a voice coil actuator, and spatialized audio effects through stereo headphones. 6-DOF trackers on the headphones and cane localize the user in virtual space, and the belt fastens our controller to the body. Upper right: People who are blind use different white cane grip styles based on need and preference. Our controller accommodates various styles. A) Traditional cane grip centered-high, B) pencil cane grip centered-low, and C) standard cane grip centered-low.

Putting the pieces together: How the controller emulates a real-world environment

Our controller uses a lightweight, three-axis brake mechanism (controlling movement side-to-side, up-down, and forward-backward) to convey to users the general shape of virtual objects. Each of the braking mechanisms has a unique construction that enables it to address different needs (see our paper for in-depth explanations of each of these). In Figure 2, we show how one of these brakes operates using a coiled cord to provide tension, and further details of how the brake utilizes friction using a capstan can be found in Figure 3. The flexibility of the three-axis system enables people to adapt the controller to different grips, depending on the context of use.
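As a rough illustration of the control idea behind such a brake system (a minimal sketch under our own assumptions—the function name, threshold, and axis mapping are ours, not the controller’s actual firmware): when the virtual cane tip contacts geometry, motion pushing into the surface is resisted, motion pulling away stays free, and the resistance is distributed across the three orthogonal brake axes according to the contact surface’s orientation.

```python
def brake_commands(contact_normal, tip_velocity, threshold=0.05):
    """Map a virtual collision to per-axis brake engagement.

    contact_normal: unit normal of the virtual surface at the contact point.
    tip_velocity:   cane-tip velocity in the same frame (m/s).
    Returns engagement levels in [0, 1] for the (side-to-side, up-down,
    forward-backward) brakes. Hypothetical simplification, not the paper's.
    """
    # Speed component pushing *into* the surface (negative along the normal).
    into_surface = -sum(n * v for n, v in zip(contact_normal, tip_velocity))
    if into_surface <= threshold:
        # Moving along or away from the surface: leave the cane free
        # so the user can lift it off or sweep past the contact.
        return [0.0, 0.0, 0.0]
    # Distribute braking over the three orthogonal axes according to
    # how much of the surface normal lies along each axis.
    return [min(abs(n), 1.0) for n in contact_normal]

# Example: cane sweeping sideways into a wall whose normal faces +x.
cmds = brake_commands([1.0, 0.0, 0.0], [-0.4, 0.0, 0.0])  # → [1.0, 0.0, 0.0]
```

Decomposing the resistance per axis, rather than locking all three brakes at once, is what would let a user slide the cane tip along a wall while being stopped from pushing through it.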

A coiled braking mechanism, with an arrow showing that it rotates from side to side.

Figure 2: Horizontal axis brake. One of three different braking mechanisms used in the device. The arrow shows the axis of motion. This mechanism consists of a capstan with a helix-wound cord that, when either side of the cord is tensioned, can render high output forces bi-directionally.

Schematic illustration of the capstan brake mechanism, including the solenoid actuators and cord wound around capstan.

Figure 3: Capstan brake mechanism in the horizontal axis brake. Two solenoid-actuated “shoes” (upper left and lower right) press the cord against the capstan, applying a small friction force that prevents rotation.

In addition to braking the cane’s movement when it collides with a virtual object, we mounted a multifrequency vibrator to the controller to mimic the high frequencies felt when the cane rubs against different textures. The controller feels and sounds different depending on the texture of the surface the virtual cane encounters. When you drag a cane across concrete, it sounds and feels very different from when you drag it across a wood floor or carpeting, and the controller mimics this experience. Finally, we provide 3D audio based on the geometry of the environment using Project Triton, technology developed at Microsoft Research. With this capability, a radio playing around the corner in another room sounds as if it’s coming from that location and traveling around a corner.
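A minimal sketch of how texture-dependent feedback might be parameterized (the material table, values, and function names here are hypothetical illustrations, not taken from the paper): each virtual surface maps to a vibration profile and an audio clip, scaled by how fast the cane tip is sweeping.

```python
# Hypothetical per-material feedback table: vibration frequency (Hz),
# base amplitude, and an assumed audio clip name for each surface.
MATERIALS = {
    "concrete": {"vib_hz": 240, "vib_amp": 0.9, "audio": "scrape_concrete.wav"},
    "wood":     {"vib_hz": 180, "vib_amp": 0.5, "audio": "scrape_wood.wav"},
    "carpet":   {"vib_hz": 90,  "vib_amp": 0.2, "audio": "scrape_carpet.wav"},
}

def texture_feedback(material, tip_speed):
    """Scale the base profile by tip speed: a faster sweep produces
    stronger and slightly brighter feedback, a stationary cane none."""
    m = MATERIALS[material]
    speed = max(0.0, min(tip_speed, 1.0))  # clamp to [0, 1] m/s
    return {
        "frequency_hz": m["vib_hz"] * (1.0 + 0.25 * speed),
        "amplitude": m["vib_amp"] * speed,
        "audio_clip": m["audio"],
    }
```

Driving both the vibration actuator and the audio playback from the same material lookup keeps the two channels consistent, so a surface that feels rough also sounds rough.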

Putting all these components together, our controller allowed users who are blind to effectively explore a complex 6-meter-by-6-meter virtual world while playing a scavenger hunt game, locating targets and avoiding obstacles and traps. In user testing, we found that seven out of eight users were able to play the game, successfully navigating to locate targets while avoiding collisions with walls and obstacles (see Figure 4 for details).

A person uses the white cane controller to navigate a virtual environment. The virtual environment contains objects that can be detected by the controller.

Figure 4: A) A participant navigates through the experimental game using the prototype haptic controller. B) A rendered first-person view of the virtual environment. C) Overhead map of the virtual environment with participant and cane (represented by a blue sphere with line).

Creating a white cane for the virtual space does come with challenges that we did not anticipate. For example, there are several types of white canes commonly used by people who are blind. These vary in weight, stiffness, and the kinds of tips they use. Some people prefer nylon or roller tips that easily glide over the ground, while others prefer the enhanced sensitivity of a metal tip. In our research, the sounds and feel of our controller were based on a carbon fiber cane with a metal tip. Users who were accustomed to the feedback from a metal-tip cane in the real world could easily identify the experiences they were having in VR. However, people who used a nylon- or roller-tip cane had a harder time identifying VR objects and surfaces because the feel and sounds were very different from what they were used to. In future work, we would like to provide users the ability to change the virtual tip and cane materials to match what they typically use in the real world.

Overall, we found that by using our system, we could provide users who are blind with a compelling VR experience through multimodal haptic and audio feedback. Our prototype system suggests that VR doesn’t have to be limited only to those who have certain capabilities. To be clear, our prototype controller is still a long way off from being a commercial product, and there are many obstacles that we must overcome before something like this would be ready for commercialization. However, as VR becomes more common, it is critical that we try to include as many people as possible in our designs. This project shows one way that we can make this a reality.

