EmotiTactor: Emotional Expression of Robotic Physical Contact

Ran Zhou

Advisors: Harpreet Sareen, Loretta Wolozin, John Sharp, Barbara Morris

Abstract

Physical touch is one of the most important channels of interpersonal communication, yet we barely notice the haptic cues in our daily lives. Research on robot-human affective interaction has primarily focused on facial expressions and vocal interaction, not touch. What if the robots being developed today and in the future could express their emotions to humans through physical contact? Could humans understand robots’ emotions through haptic interaction?

EmotiTactor is a research experiment that aims to improve the emotional expressiveness of robots. A tangible interface (Fig. 1) was constructed to deliver haptic stimuli for primary emotions through a series of machine tactors (tactile organs).

Figure 1: EmotiTactor: a robotic tactor interface

Motivation

While studies have begun to explore emotional haptic interaction between humans and robots, little research focuses on the specific emotions conveyed from robots to humans via touch. I am interested in experimenting with active touch motions for robots, placing humans in the role of receiving the stimuli and interpreting the emotions. My goal is to let robots express themselves through tactile behaviors.

Research

Hertenstein et al. [1] showed that humans can decode distinct emotions communicated via touch. In their study, participants were divided into dyads and randomly assigned the roles of encoder and decoder. The encoder conveyed an assigned emotion by touching the decoder’s forearm without any visual or vocal communication, and the decoder chose the emotion they felt through the cutaneous stimuli on a “response sheet.” The results showed that anger, fear, disgust, love, gratitude, and sympathy could be decoded at above-chance levels. The authors also recorded the most commonly used tactile behaviors for each emotion. These findings inspired me to carry the primary emotions of human-human touch into the context of HCI and create human-robot touches.

Prototype

To study the emotional expression of robotic contact based on human-to-human touch research, I replaced the role of the encoder with the touches of robotic tactors. My first prototype (Fig. 2) is a structure that fits the length of the average forearm at 18 inches. Inside the structure are four machine tactors driven by servo motors. Through the servo motors’ motion, each tactor simulates one or two of the tactile behaviors used most frequently for each target emotion in Hertenstein et al.’s study [1]: squeezing, trembling, patting, hitting, shaking, and stroking.

Figure 2: First interface prototype
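For illustration, the sketch below shows one way such a four-tactor setup could be wired and addressed on an Arduino-style microcontroller. The pin numbers, angles, and timings are assumptions made for the sketch, not the project’s actual parameters; only the emotion-to-tactor mapping follows the prototype description.

```cpp
// Minimal sketch (assumed pins and timings): four servo-driven tactors,
// each responsible for one or two of the six target emotions.
#include <Servo.h>

enum Emotion { ANGER, FEAR, HAPPINESS, SADNESS, DISGUST, SYMPATHY };

Servo tactors[4];                        // tactors 1-4 from the prototype
const int SERVO_PINS[4] = {3, 5, 6, 9};  // assumed PWM pins

// Emotion-to-tactor mapping from the prototype: tactor 1 handles fear/sadness,
// tactor 2 happiness, tactor 3 disgust, tactor 4 anger/sympathy.
int tactorFor(Emotion e) {
  switch (e) {
    case FEAR: case SADNESS: return 0;
    case HAPPINESS:          return 1;
    case DISGUST:            return 2;
    default:                 return 3;  // ANGER, SYMPATHY
  }
}

void setup() {
  for (int i = 0; i < 4; i++) tactors[i].attach(SERVO_PINS[i]);
}

void loop() {
  // Example: a single gentle pat (sympathy) on tactor 4.
  Servo &pat = tactors[tactorFor(SYMPATHY)];
  pat.write(60);
  delay(400);
  pat.write(90);
  delay(400);
}
```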

Study

Next, I ran two rounds of studies with 10 participants in each round (study 1: 2 male, 8 female, average age 24.6; study 2: 4 male, 5 female, 1 other, average age 24.1) to test the haptic simulations of the primary emotions. Before participating, each participant was briefed on the background and the protocol described above.

On arrival, participants sat at a table and were asked to wear noise-reduction earphones (Fig. 3). They placed their forearm into the machine through a hole at the bottom of an opaque foam board that separated them from the EmotiTactor on the other side. The earphones and the board were designed to preclude nontactile cues such as movement, gesture, or motor sound in the communication [2]. Calibration was carried out manually before each test to guarantee that the tactors actually touched the participant’s arm. The machine then cycled through the haptic functions in a random order of the six emotions. After each function, participants selected the emotion that matched their interpretation on a response sheet, choosing one of seven options: anger, fear, happiness, sadness, disgust, sympathy, and N/A.

The difference between the two rounds was that in the follow-up study, the options on the response sheet were arranged in a randomized order to exclude bias from the order in which people saw the emotion options. In study 1, I found that many individuals had no prior experience of being touched by a machine, and their interpretation of the emotions was strongly affected by the order of presentation. To reduce this interference, I added an overview phase to familiarize participants with the interface: the machine cycled through all of the haptic functions, but participants were not required to respond during this routine.

Figure 3: Experimental scene
Figure 4: Response sheets
Figure 5: A participant taking the test
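The snippet below sketches one way the random presentation order of the six emotion functions could be generated for each participant. The study does not specify its randomization method, so this is only an assumed implementation in standard C++.

```cpp
// Sketch (assumed method): shuffle the six emotion functions into a random
// presentation order for one participant.
#include <algorithm>
#include <iostream>
#include <random>
#include <string>
#include <vector>

int main() {
  std::vector<std::string> emotions = {
      "anger", "fear", "happiness", "sadness", "disgust", "sympathy"};

  std::random_device rd;
  std::mt19937 gen(rd());
  std::shuffle(emotions.begin(), emotions.end(), gen);  // per-participant order

  for (const auto& e : emotions) std::cout << e << "\n";
  return 0;
}
```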

Result

The results in Table 1 indicate that humans can decode at least five emotions through robotic tactile behaviors (fear, disgust, happiness, anger, and sympathy). In study 2, decoding accuracy ranged from 40% to 100%, well above chance level (25% [3]). Participants tended to confuse sadness with sympathy, consistent with the earlier study [1].

Table 1: Percentage of decoding accuracy for each target emotion

Design

After the study, I built a new wooden robot based on the results. The device assembles all the tactors on a smaller base that fits the length from the wrist to the middle of the forearm (Fig. 1, 5 x 8 inches). This form makes it more portable and more likely to fit into daily life. The research has shown that humans can decode primary emotions through robotic touch alone; in real life, however, most interactions combine signals from multiple channels. So instead of making the second prototype (Fig. 6) a blind box that users must put their forearm inside, I designed it at a comfortable scale and weight with a clean visual style. People can interact with it whenever they want, and it does not even interfere with typing on a keyboard while their forearm rests on it.

Figure 6: EmotiTactor Robot

Implementation of Tactors

Tactor 1: Fear & Sadness

The tactor was designed in the shape of two bending fingers driven by SD 90 servo motors (torque: 2.5 kg-cm). It can squeeze the human’s forearm, trembling (for fear) or stroking (for sadness).

Figure 7: Tactor 1

Tactor 2: Happiness

The tactor was designed in the shape of a grasping hand controlled by an SD 90 servo. It swings and shakes to mimic happiness.

Figure 8: Tactor 2

Tactor 3: Disgust

The tactor was driven by a servo motor with higher torque (3.2 kg-cm). It can push the hand from the side. Since this emotion and its tactile behavior are strong, I programmed it with a wide range of motion.

Figure 9: Tactor 3

Tactor 4: Anger & Sympathy

The tactor was designed in the shape of a palm driven by an SD 90 servo. It has a larger contact area than the other tactors and can hit (for anger) and pat (for sympathy).

Figure 10: Tactor 4

Mechanism and Algorithm

The line graphs in Fig. 11 visualize each servo motor’s motion for the corresponding tactor, designed to mimic the tactile behavior of an individual emotion. For example, the fear function uses slightly random motion to give a tactile sensation of trembling; the anger function is programmed with high frequency, large range, and randomness; and the sympathy function is gentle and regular. All of these functions were designed and programmed based on the most frequent types of touch for each emotion reported in Hertenstein et al.’s study [1].

Figure 11: Graph for the servo motor’s function for target emotions
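To make the motion design concrete, the sketch below shows what such servo functions might look like for three of the emotions on an Arduino-style controller. The angles, timings, and pin number are illustrative assumptions, not the project’s actual parameters.

```cpp
// Sketch (assumed angles and timings): servo motion patterns for three
// emotion functions, using the Arduino Servo library.
#include <Servo.h>

Servo tactor;

// Fear: small, slightly random jitter around a center angle reads as trembling.
void fearTremble(int cycles) {
  for (int i = 0; i < cycles; i++) {
    tactor.write(90 + random(-8, 9));  // small random offset
    delay(random(30, 60));             // irregular timing
  }
}

// Anger: fast, wide, partly random strikes.
void angerHit(int cycles) {
  for (int i = 0; i < cycles; i++) {
    tactor.write(random(20, 41));      // wide swing toward the arm
    delay(80);
    tactor.write(random(140, 161));    // wide swing away
    delay(80);
  }
}

// Sympathy: slow, regular pats with fixed amplitude and rhythm.
void sympathyPat(int cycles) {
  for (int i = 0; i < cycles; i++) {
    tactor.write(70);
    delay(500);
    tactor.write(100);
    delay(500);
  }
}

void setup() {
  tactor.attach(9);           // assumed PWM pin
  randomSeed(analogRead(0));  // seed the pseudo-random generator
}

void loop() {
  fearTremble(40);
  delay(1000);
  angerHit(10);
  delay(1000);
  sympathyPat(5);
  delay(1000);
}
```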

Future Work and Application

EmotiTactor has great potential to be implemented in a variety of contexts. It can be applied to remote social touch when humans communicate through VR devices. It can also support communication between medical robots and humans, especially people with hearing or vision impairments. The following are some potential applications of EmotiTactor.

EmotiTactor can be applied to the design of a theatre chair. The armrest of the seat touches you while you watch a movie, and its emotion changes with the story of the film.

Figure 12: Expressive theatre chair

The Touch-you cushion (Fig. 13(A)) is an interactive companion toy: it can pat you with sympathy to help you fall asleep and wake you up with the happiness function. The Timid watch belt (Fig. 13(B)) shows its fear when the smartwatch is running out of power. The Strict lid (Fig. 13(C)) can be fitted to a bottle or jar: it may punish you with the anger function when you are too impatient and try to drink scalding water or take too many candies. Applying EmotiTactor in this way makes everyday objects expressive and interactive, which can make our lives more colorful.

Figure 13: (A) Touch-you cushion (B) Timid watch belt (C) Strict lid

References

[1] Matthew J. Hertenstein, Dacher Keltner, Betsy App, Brittany A. Bulleit, and Ariane R. Jaskolka. 2006. Touch communicates distinct emotions. Emotion 6, 3 (2006), 528–533. DOI:http://dx.doi.org/10.1037/1528-3542.6.3.528

[2] Amol Deshmukh, Bart Craenen, Alessandro Vinciarelli, and Mary Ellen Foster. 2018. Shaping Robot Gestures to Shape Users’ Perception: The Effect of Amplitude and Speed on Godspeed Ratings. In Proceedings of the 6th International Conference on Human-Agent Interaction (HAI ’18). Association for Computing Machinery, New York, NY, USA, 293–300. DOI:https://doi.org/10.1145/3284432.3284445

[3] Mark G. Frank and Janine Stennett. 2001. The forced-choice paradigm and the perception of facial expressions of emotion. Journal of Personality and Social Psychology 80, 1 (2001), 75–85. DOI:http://dx.doi.org/10.1037/0022-3514.80.1.75

[4] Jinsil Hwaryoung Seo, Pavithra Aravindan, and Annie Sungkajun. 2017. Toward Creative Engagement of Soft Haptic Toys with Children with Autism Spectrum Disorder. In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition (C&C ’17). Association for Computing Machinery, New York, NY, USA, 75–79. DOI:https://doi.org/10.1145/3059454.3059474

[5] Sachith Muthukumarana, Don Samitha Elvitigala, Juan Pablo Forero Cortes, Denys J.C. Matthies, and Suranga Nanayakkara. 2020. Touch me Gently: Recreating the Perception of Touch using a Shape-Memory Alloy Matrix. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–12. DOI:https://doi.org/10.1145/3313831.3376491

 


Publication

EmotiTactor: Emotional Expression of Robotic Physical Contact was accepted for publication and presentation in the Provocations and Work in Progress track of the 2020 ACM Conference on Designing Interactive Systems (DIS ’20).

 

 

 

About
Ran Zhou is a creative technologist and researcher with a background in architecture.
She is now devoted to exploring emotional communication between robots and humans through haptic interaction.