XRDS: Crossroads, The ACM Magazine for Students

Designing for haptics

By Craig Shultz and Vivian Shen

Tags: Haptic devices

Haptics is a diverse field of study. Psychologists have been probing our tactile perception for nearly a century [1], while force feedback haptic interfaces have a rich history going back decades; they were first developed for military and industrial purposes [2]. Today, the emergence of high-fidelity extended reality technology has brought with it an influx of interest from a broader community in how we physically interact with and interpret the world around us. This curiosity, sparked by the desire to create good haptic experiences, is leading a new generation of gamers, designers, professionals, and researchers to ask: "How can haptics improve user experience and realism in extended reality (XR) systems?"

Although much of our perception is dominated by vision, the way we act on the world is through our sense of touch. Haptics, or the bilateral "give-and-take" of forces and movement between our body and the environment, allows us to exert control and influence on the world. These complex interactions are sometimes taken for granted by adults; we learn to interact with and manipulate the world around us at an early age, until it becomes second nature. However, as users begin to dip their heads into virtual worlds through head-mounted displays and stereo headphones, the desire to bring their hands and bodies into those worlds has become more pressing. Users are realizing how much they have taken haptics for granted, and how much they miss interacting with the world through touch. This has created a buzz in haptics communities (pun intended), as haptic interfaces have traditionally struggled to gain traction in modern consumer products. Virtual and augmented reality devices present a unique opportunity to design novel expressive interfaces that utilize haptics in an engaging way. First, however, let us take a brief look at where XR interfaces are today.

The current trend is to implement XR user interfaces through a mixture of motion-tracked controllers and traditional buttons and joysticks, but this may simply be a holdover rather than the way forward. These types of controllers are familiar, making them easy to pick up and use with a relatively short learning curve. Motion-tracked controllers have been widespread since the release of the Nintendo Wii, joysticks date back to the earliest electronically controlled aircraft, and the use of buttons dates back to, well, just about the invention of electricity itself [3]. These haptic interfaces are tried and tested and serve well as a starting point for great haptic XR experiences.

Yet one might wonder: can these truly be considered haptic feedback devices? Is haptic feedback not just about the vibration response that has been common in controllers for decades (see Figure 1)? In fact, even within the research community, the term "haptics" has been used in many different contexts. Sometimes it is used only to describe programmable tactile feedback. Other times it refers to large, interactive robots that push and pull on their operator using motors and mechanisms. It can also be used to describe motion sensing and actuation, such as haptic gloves that translate hand movement and action into VR [4], or to describe the clickity-clackity feeling of a mechanical keyboard (the audible sound also usually gets lumped into the experience). The truth is, all of these examples involve haptics, as haptics is pervasive in the field of human-computer interaction (HCI).

Mapping Haptic Interaction

To explain how all of these examples interrelate and fall under the umbrella of haptics, it is useful to introduce a framework. This framework serves as a map of the haptic space in the context of human-computer interfaces, and we have found it useful for identifying which areas need to be addressed, which are sufficiently developed, and which, perhaps, have been unintentionally neglected. The framework is loosely borrowed from a now-classic work on physical product design, The Psychology of Everyday Things by Don Norman [5]. After introducing the framework, we will dive a bit deeper into each section, and finally discuss an example haptic experience.

Our framework is called the Haptic Action Cycle (see Figure 2). It has two sides, the human and the computer, which communicate with each other through two paths: an input or execution path, and an output or feedback path. The human executes intention-based actions on the computer, typically through movement: pressing a button, flicking a finger, or turning the head. The computer senses this input, updates its internal state, and mediates a physical output response to the human using some type of actuator. This output can use a rendering algorithm to produce physical feedback, such as a pulsed vibration of a motor, or it can rely on natural feedback that is already built into the interaction mechanism, such as the snap of a button spring. The solid feel of a tablet screen as it is tapped can serve as native tactile feedback; yes, even the natural feel of a touchscreen can be designed, it just is not programmable. This feedback is sensed by the user's tactile receptors and interpreted by their perceptual model. If the user's perception matches what they intended to do, the cycle is complete. The cycle can then repeat, as the human and the computer continuously interact and communicate. It is worth noting that the cycle does not have to be complex; in fact, complexity can make for a more cumbersome and difficult interaction. For instance, pressing a key on a keyboard, or scrolling down a page, can be an entire cycle. Often, the more natural the mapping and the more responsive the feedback, the shorter and more successful the cycle.
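
To make the cycle concrete, here is a minimal sketch in Python (our own illustration rather than code from any particular device): the execution path senses a button event, the computer updates its state, and the feedback path maps that state to an actuator command.

```python
from dataclasses import dataclass

@dataclass
class ButtonState:
    pressed: bool = False

def sense_input(raw_event: str) -> bool:
    """Execution path: the human's action arrives as a raw input event."""
    return raw_event == "button_down"

def update_state(state: ButtonState, pressed: bool) -> ButtonState:
    """The computer updates its internal model of the interaction."""
    return ButtonState(pressed=pressed)

def render_feedback(state: ButtonState) -> str:
    """Feedback path: map the new state to a physical output command.
    Here a press triggers a short vibration pulse; a real system would send
    this command to an actuator driver instead of returning a string."""
    return "vibrate(duration_ms=20)" if state.pressed else "idle"

# Two passes through the cycle: press, then release.
state = ButtonState()
for event in ["button_down", "button_up"]:
    state = update_state(state, sense_input(event))
    print(event, "->", render_feedback(state))
```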

These cycles happen all the time. They are the basis of our interaction with technology. All devices implement some subset of this interaction cycle, whether it be input, output, or both. Many times the feedback path is simply visual or auditory, yet almost always the execution path includes haptics. Interfaces usually sit dormant until we physically interact with them (one notable exception is the emerging case of voice-controlled interfaces). This prevalence of visual and auditory feedback means that when people mention haptic devices, they almost always mean haptic feedback devices. This is, indeed, where we at the Future Interfaces Group have spent the majority of our time and effort in thinking about and designing haptic experiences, but it is by no means the whole story. Creating good haptic experiences means facilitating all parts of the Haptic Action Cycle, not just the feedback path. If any one aspect of the cycle breaks down, the whole experience can fall flat. Now that we have a map of the haptic landscape, let us explore each stage a bit deeper.

Input Sensing

First, we have the input side of the haptic cycle, which includes a wide array of sensors. As mentioned before, current XR devices have largely defaulted to controllers of some kind, where the inputs come directly from the various buttons, joysticks, and pads and are deterministic. Users are comfortable with this method, as it builds on our interfaces with gaming consoles and remote controls, but we do not expect this method of interaction to be the final form of user input. XR devices have yet to settle on best-practice input, and a range of possible sensors can be utilized. Directly sensing bodily movement is especially intriguing in XR. For head tracking, inertial measurement units (IMUs) in head-mounted displays are already the gold standard. Computer vision combined with machine learning is opening up body, hand, and face tracking using streams of visual data, ranging from low-cost cameras [6] to systems using LiDAR [7].

Each of these methods offers some aspect of haptic input that can be sensed by the computing platform and acted on as part of the Haptic Action Cycle, and ensuring the reliability and robustness of these methods can bring real gains to haptic interactions in XR. Take hand tracking, for instance. Natural hand interactions cannot be attempted unless the system knows the exact location and orientation of each part of the hand. A range of haptic sensing gloves, in both industry [4] and research [8] spaces, have sprung up to take on the challenge.
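
As one concrete illustration of vision-based hand input (the library choice is ours; the article does not prescribe a particular tracker), the open-source MediaPipe Hands package can estimate per-joint hand landmarks from an ordinary webcam, which a haptic system could then act on:

```python
# Sketch of camera-based hand tracking as a haptic input source.
# Assumes `pip install mediapipe opencv-python`; MediaPipe is used here only
# as an illustrative tracker, not one endorsed by the article.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.7)
capture = cv2.VideoCapture(0)  # a low-cost webcam as the visual data stream

while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    # MediaPipe expects RGB images; OpenCV delivers BGR frames.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            tip = hand.landmark[8]  # index fingertip, normalized image coordinates
            print(f"index tip: x={tip.x:.2f}, y={tip.y:.2f}, z={tip.z:.2f}")

capture.release()
```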

Output Feedback

On the flip side is the output side of the haptic cycle, which encompasses the body of work that deals with providing feedback from the computer to the human. Again, pretty much any form of delivering tactile sensations can be utilized for haptic feedback, and there are many ways to actuate these sensations. As a zany example, you could strap pigeons to your body and electrically stimulate them every time you click a button, feeling the flapping of their wings, and that would be a valid Haptic Action Cycle. Realistically, however, research in this area has typically stayed within a set of well-established methods for haptic feedback. These are broadly broken down into two categories: kinesthetic and cutaneous.

Kinesthetic devices deal with delivering forces to the body. The main focus of the feedback is to apply constraints and mechanical forces to the musculoskeletal system. In our everyday life, we receive force feedback constantly. Walking along the ground, opening doors, and grabbing and lifting objects are all kinesthetic activities. Kinesthetic feedback is especially interesting and challenging in VR, as the virtual world is full of intangible visual material that our physical hand passes right through, breaking the realism of virtual reality. To fully simulate a real environment, the objects in any VR scene cannot just be seen and heard but must be felt as well; when the hand is placed on a table, one should not be able to push through it. Because of the high force requirements, kinesthetic output devices tend to be powered by motors, brakes, and other robotic actuators. These can be built into an exoskeleton structure that connects to major points of the body such as the fingertips [9], waist, feet, or wrists.
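
The table example maps onto the classic penalty-based, or "virtual wall," rendering used by many kinesthetic devices: when the tracked hand penetrates a virtual surface, the device pushes back with a spring-like force proportional to the penetration depth. A minimal sketch, with made-up parameter values:

```python
def virtual_wall_force(hand_height_m: float,
                       table_height_m: float = 0.75,
                       stiffness_n_per_m: float = 800.0) -> float:
    """Penalty-based kinesthetic rendering for a horizontal table top.

    Returns the upward force (in newtons) a force-feedback device should
    apply. Above the surface there is no contact, so no force; below it,
    force grows linearly with penetration depth (Hooke's law).
    """
    penetration = table_height_m - hand_height_m
    if penetration <= 0.0:
        return 0.0
    return stiffness_n_per_m * penetration

# The hand sinks 5 mm into the table -> roughly 4 N of resistance.
print(virtual_wall_force(0.745))
```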


Creating good haptic experiences means facilitating all parts of the Haptic Action Cycle, not just the feedback path.


In contrast, cutaneous devices deliver sensations directly to the skin. These devices are typically broken down by how the user interacts with them. Contact-based devices are those where your skin actually touches the actuator and receives sensations directly, such as vibration motors (see Figure 1) or electrical stimulation. Mid-air haptic devices stimulate your skin at a distance using methods like fans or ultrasound phased arrays [10]. Thermal displays, which can provide temperature control [11], are also lumped into the cutaneous category. Directly stimulating receptors in the skin typically requires less energy than physically resisting human motion, so cutaneous feedback devices can be made smaller and more power efficient than kinesthetic ones.
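
For contact-based vibrotactile feedback, the drive signal is often just a sine carrier shaped by an envelope; skin sensitivity to vibration peaks at a few hundred hertz, so a carrier near 250 Hz is a common choice. The sketch below is a generic illustration of generating such a burst, not code from any specific device:

```python
import numpy as np

def vibration_burst(carrier_hz: float = 250.0,
                    duration_s: float = 0.05,
                    sample_rate_hz: int = 8000) -> np.ndarray:
    """Generate a short vibrotactile burst: a sine carrier shaped by a
    Hann (raised-cosine) envelope so the pulse starts and ends smoothly.
    The samples would be streamed to an actuator driver or DAC."""
    t = np.arange(int(duration_s * sample_rate_hz)) / sample_rate_hz
    carrier = np.sin(2.0 * np.pi * carrier_hz * t)
    envelope = np.hanning(t.size)
    return carrier * envelope

burst = vibration_burst()
print(f"{burst.size} samples, peak amplitude {burst.max():.2f}")
```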

Human Anatomy

Kinesthetic and cutaneous feedback devices are intended to interface directly with human anatomy. The human skin contains various mechanoreceptors that respond to external mechanical stimuli. Different types of receptors respond to low-frequency pressure, mid- and high-frequency vibration, and skin stretch. Approximately 250,000 of these receptors are distributed throughout the body, with the highest densities on the fingertips and lips [12]. There also exist less well-known receptors at the base of hair follicles that let us feel lighter sensations like wind, which can be relevant to XR. Other nerve receptors in the skin do not respond to mechanical stimuli but are sensitive to pain or thermal cues. Finally, a range of proprioceptors in the muscles, tendons, and joints informs our kinesthetic system, telling us when it is easy or hard to move our bodies. It is important to remember that all haptic feedback must travel through one of these receptor channels; these are the basic sensors we humans are equipped with to receive haptic feedback. Therefore, basic knowledge of the sensitivities of human anatomy can help designers focus their efforts on how to provide appropriate haptic cues for the experience at hand.

Environmental Renderings

The final important area in designing haptic XR experiences is figuring out the best way to render haptic effects given a certain input/output setup. This step usually comes near the end of the design process, when the constraints of the sensors and actuators are set. Within these constraints, we aim to find the best possible rendering methods using the Haptic Action Cycle. Such constraints always exist in haptic feedback devices, as no haptic actuator can perfectly reproduce arbitrary haptic experiences (there is no 4K-display equivalent in haptics). The goal, then, is to see what range of haptic effects can be elicited, how well users understand and like them, and how realistic or immersive the sensations are. This typically involves iterative user testing or formal psychophysical testing to determine what the final interface is capable of.
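
As a flavor of what formal psychophysical testing can look like (a generic procedure, not the authors' specific protocol), an adaptive one-up, two-down staircase adjusts stimulus intensity trial by trial and, for a noisy human observer, converges near the intensity detected about 70 percent of the time:

```python
def staircase(respond, start=1.0, step=0.1, floor=0.0, trials=40):
    """One-up, two-down adaptive staircase.

    `respond(intensity)` should return True if the participant detected the
    stimulus. Two consecutive detections lower the intensity; any miss raises
    it. For a stochastic observer this track converges near the ~70.7 percent
    detection threshold.
    """
    intensity, streak, history = start, 0, []
    for _ in range(trials):
        detected = respond(intensity)
        history.append(intensity)
        if detected:
            streak += 1
            if streak == 2:
                intensity, streak = max(floor, intensity - step), 0
        else:
            intensity, streak = intensity + step, 0
    return sum(history[-10:]) / 10  # average of the last trials as a rough estimate

# Simulated participant who reliably detects vibrations above 0.42 (arbitrary units).
print(staircase(lambda level: level > 0.42))
```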

Designing with Haptics in Mind

With this framework in mind, there are numerous ways to design a Haptic Action Cycle between humans and computers. However, industry typically still treats haptics as an add-on to existing interfaces, rather than as a design parameter to account for from the start. The way we have come to expect visual and auditory feedback can and should be extended to haptics as well, and we can take advantage of the diverse range of actuators to design haptic sensations tailored to complement the system interactions. This specific design flow is something we tried to address in our CHI 2022 paper, "Mouth Haptics in VR using a Headset Ultrasound Phased Array," by creating an API that enabled mouth haptic effects to be dragged and dropped onto objects in Unity [10]. We specifically chose to emulate the various other parameters that Unity makes incredibly easy to set, like dragging audio files onto objects or dragging models into the scene, to draw a parallel between haptics and these design settings we take for granted. We thought of the overall system in terms of the Haptic Action Cycle's stages, with the computer being the VR headset (and the mouth haptics device we built onto it). The input came from the user's movement, with the haptic output event being triggered either by a haptic element coming into contact with the user's mouth, like rainfall, or by the user bringing themselves to a haptic node, like leaning in to drink from a water fountain or using the controller to bring a mug to their lips (see Figure 3). While our interactions were ultimately designed around the suite of animations we could showcase on our device, our hope is that future XR developers will consider haptics one of the essential components of an XR experience.
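
The idea of attaching haptic effects to scene objects, independent of the actual Unity API from the paper (which we are not reproducing here), can be sketched as a small component system: an effect is a piece of data attached to an object, and the engine fires it whenever the tracked mouth position comes within range. All names below are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class HapticEffect:
    """A haptic 'asset' attached to a scene object, analogous to dragging an
    audio clip onto an object in an engine editor. Illustrative names only."""
    pattern: str              # e.g. "raindrop", "water_stream"
    intensity: float = 1.0

@dataclass
class SceneObject:
    name: str
    position: Tuple[float, float, float]   # meters, world coordinates
    haptic: Optional[HapticEffect] = None

def mouth_haptic_events(mouth_position, scene, radius=0.05):
    """Yield the haptic effect of any object within `radius` of the mouth,
    i.e. the output-triggering step of the Haptic Action Cycle."""
    for obj in scene:
        if obj.haptic is None:
            continue
        dist = sum((a - b) ** 2 for a, b in zip(mouth_position, obj.position)) ** 0.5
        if dist <= radius:
            yield obj.name, obj.haptic

scene = [SceneObject("fountain_stream", (0.0, 1.40, 0.30),
                     HapticEffect("water_stream", intensity=0.8))]

# The user leans in to drink: the head-tracked mouth position enters the stream's radius.
for name, effect in mouth_haptic_events((0.0, 1.40, 0.28), scene):
    print(f"trigger '{effect.pattern}' (intensity {effect.intensity}) from {name}")
```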


Virtual and augmented reality devices present a unique opportunity to design novel expressive interfaces that utilize haptics in an engaging way.


The Future of Haptics

Of course, one of the barriers to haptic feedback accessibility in most systems is the lack of a standardized actuator. Although countless haptic methods exist, hardware designers are not sure which actuators to incorporate into their computing devices, because most haptic methods are designed for specific sensations, while the interfaces we want to design are typically more general. This lack of generalizability is a huge barrier for tactile devices, which do not have the equivalent of a screen for visual stimulation or a speaker for audio stimulation. In recent years, the industry standard has been trending toward vibrotactile motors as an easily integrated, cheap haptic actuator, but the issue with this method is that it delivers the same vibration sensation every time, and thus lacks expressivity and range of stimulation. However, new and innovative haptic methods and devices coming from both industry and academia indicate that the field of haptics has never been more popular. We are excited and hopeful for a bright future full of devices that we can not only look at and listen to but also feel.

References

[1] Katz, D. The World of Touch (L. E. Krueger, Trans.). Erlbaum, Mahwah, NJ, 1989. (Original work published 1925.)

[2] Salisbury, J. K., and Craig, J. J. Articulated hands: Force control and kinematic issues. The International Journal of Robotics Research 1, 1 (1982), 4–17.

[3] Plotnick, R. At the interface: The case of the electric push button, 1880–1923. Technology and Culture 53, 4 (2012), 815–845.

[4] Varga, S. Haptic gloves for virtual reality and robotics. HaptX; https://haptx.com

[5] Norman, D. A. The Psychology of Everyday Things. Basic Books, 1988.

[6] Ahuja, K., Shen, V., Fang, C. M., Riopelle, N., Kong, A., and Harrison, C. ControllerPose: Inside-out body capture with VR controller cameras. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '22). ACM, New York, 2022, 1–13.

[7] Patil, A. K., Balasubramanyam, A., Ryu, J. Y., Chakravarthi, B., and Chai, Y. H. An open-source platform for human pose estimation and tracking using a heterogeneous multi-sensor system. Sensors 21, 7 (2021), 2340.

[8] Caeiro-Rodríguez, M., Otero-González, I., Mikic-Fonte, F. A., and Llamas-Nistal, M. A systematic review of commercial smart gloves: Current status and applications. Sensors 21, 8 (2021), 2667.

[9] Secco, E. L., and Tadesse, A. M. A wearable exoskeleton for hand kinesthetic feedback in virtual reality. In International Conference on Wireless Mobile Communication and Healthcare. Springer, Cham, 2019, 186–200.

[10] Shen, V., Shultz, C., and Harrison, C. Mouth haptics in VR using a headset ultrasound phased array. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '22). ACM, New York, 2022, 1–14.

[11] Feelreal multisensory VR mask. Feelreal; https://feelreal.com

[12] Corniani, G., and Saal, H. P. Tactile innervation densities across the whole body. Journal of Neurophysiology 124, 4 (2020), 1229–1240.

Authors

Craig Shultz is a postdoctoral researcher at the Human-Computer Interaction Institute at Carnegie Mellon University in the Future Interfaces Group. His research lies at the intersection of haptics, electromechanical systems, and human-computer interaction. Craig's work has won numerous awards at various haptics and HCI venues, and led to several patents on new actuation mechanisms. Before CMU, he was the VP of Research and Development at Tanvas, a novel electro-adhesive haptic touchscreen startup based on his Ph.D. research under Dr. Ed Colgate at Northwestern University.

Vivian Shen is a second-year Ph.D. student at the Robotics Institute of Carnegie Mellon University. She does human-computer interaction research with the Future Interfaces Group, advised by Dr. Chris Harrison. Her background is in embedded systems and computer perception, and she has recently been applying this knowledge to input modalities and haptic output systems. She is a Swartz Entrepreneurial Fellow, has an NSF Grant Honorable Mention, and is a recipient of two Best Paper Awards at premier venues in HCI.

Figures

Figure 1. Four controllers through the years (from left to right): the N64 controller with an attachable Rumble Pak (1997), which vibrated the controller alongside visual game feedback; the Wii Remote (2006), with motion sensing and built-in vibration; the Xbox wireless controller (2013), with multiple joysticks and buttons and built-in vibration; and the Oculus Quest 2 controller (2020), with motion sensing and built-in vibration.

Figure 2. A visualization of the Haptic Action Cycle.

Figure 3. A user wearing our mouth haptics device leans in to drink from a water fountain in VR. The input (the movement of the head) triggers the output (the feeling of water on the lips from an ultrasound phased array), completing a Haptic Action Cycle.


This work is licensed under a Creative Commons Attribution 4.0 International License.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.