Talent.
A novel HRI study on the effects of robotic eye-gaze mirroring.
Exploring the Impact of Eye Gaze Mirroring in Human-Robot Interaction
Imagine sitting across from a robot—let’s call it Talent. As the conversation flows, you glance to the side, and Talent’s "eyes" follow yours. Did you notice? How did it make you feel?
Eye gaze is a cornerstone of human communication, deeply tied to comfort and engagement. But what happens when this nonverbal cue comes from a robot? Our team set out to explore this question through a novel study on human-robot interaction. Our goal was to contribute to a deeper understanding of how robots can use subtle, human-like behaviors to foster connection and trust in various contexts.
Over 10 weeks in Carnegie Mellon University's Human-Computer Interaction course, our team (Gerry D'Ascoli, Michelle Lu, Sophia Timko, and me) designed and ran an experiment to investigate how a robot's eye gaze mirroring influences user experience. My role as a researcher encompassed:
Conducting research to inform our study design and contextualize our findings within existing literature.
Developing the protocol to structure the experiment, ensuring consistency and reliability in participant interactions.
Creating the task script and facilitating the conversation of "Talent," the robot, using a Wizard of Oz method, in which a human operator behind the scenes controlled the robot's responses to maintain consistent experimental conditions.
Results.
Eye Gaze Mirroring Creates Positive Impressions
Although most comparisons did not reach statistical significance, the findings suggest that eye gaze mirroring holds promise for enhancing human-robot interactions. Participants in the experimental condition, where the robot mirrored their eye gaze, perceived the interaction as more natural and positive than participants in the control condition.
In post-experiment surveys, participants expressed greater agreement with positive statements and less agreement with negative ones about their experience with the robot in the experimental condition. Moreover, their descriptive feedback included more positive words and fewer negative words, highlighting a generally favorable impression of the robot.
These trends indicate that eye gaze mirroring may contribute to a more engaging and comfortable user experience, warranting further exploration in future studies.
Subjective Metrics
To evaluate participants' subjective experiences, paired t-tests were conducted on responses to all eight survey questions, comparing the experimental and control conditions. While none of these results reached statistical significance at the p < .05 threshold, further analysis using sign tests revealed notable findings.
Participants rated the experimental condition, where the robot mirrored their eye gaze, as more sociable (p = .031) and more engaged (p = .035) compared to the control condition. However, participants also perceived the control condition as demonstrating better listening (p = .035) than the experimental condition.
These insights suggest that while eye gaze mirroring may enhance perceptions of sociability and engagement, it could also influence other aspects of interaction dynamics, such as the perception of attentiveness.
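For readers who want to reproduce this style of analysis, the sketch below shows one way to run a paired t-test and a sign test in Python with SciPy. The ratings are illustrative placeholders, not our study's data, and the variable names are assumptions.

```python
import numpy as np
from scipy import stats

# Illustrative per-participant Likert ratings for one survey question,
# paired by participant across the two conditions (not the study's data).
experimental = np.array([4, 5, 3, 4, 5, 4, 3, 5, 4, 4])  # mirrored-gaze condition
control      = np.array([3, 4, 3, 3, 4, 4, 3, 4, 3, 4])  # no-mirroring condition

# Paired t-test on the per-participant ratings.
t_stat, t_p = stats.ttest_rel(experimental, control)

# Sign test: count positive differences among non-ties and compare against
# a fair coin with an exact binomial test.
diffs = experimental - control
n_pos = int(np.sum(diffs > 0))
n_nonzero = int(np.sum(diffs != 0))
sign_p = stats.binomtest(n_pos, n_nonzero, 0.5).pvalue

print(f"paired t-test: t={t_stat:.2f}, p={t_p:.3f}; sign test: p={sign_p:.3f}")
```

The sign test uses only the direction of each paired difference, so it makes fewer assumptions about ordinal Likert data than the t-test, which is one reason it can surface effects the t-test misses.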
Objective Metrics
Objective data were collected using Pupil Labs eye-tracking glasses to evaluate participants' visual attention and behavior during the interactions. Four metrics were analyzed, and paired t-tests were performed to compare the experimental condition (with eye gaze mirroring) and the control condition.
The analysis revealed no statistically significant differences between the two conditions for any of the four metrics (all p > .05).
While the objective results did not yield significant findings, they provide a foundation for refining measurement techniques and exploring nuanced effects in future research.
Background.
The Importance of Eye Gaze in Human-Robot Interaction
Eye gaze plays a critical role in human communication as it conveys attentiveness, facilitates interaction, and enhances understanding. As a key component of nonverbal communication, it helps build rapport, trust, and connection between individuals. In the context of robots interacting with humans, mastering social skills like eye gaze is essential for creating more natural, engaging, and effective human-robot interactions.
In the early stages of our project, we conducted an extensive review of existing research on topics such as robot mimicry, eye gaze, eye tracking technology, nonverbal communication, and trust in robots. This foundational research guided the design of our study. For example, we chose to equip our robot with round eyes and large irises, following previous research (Onuki et al., 2013) that identified these features as the most "friendly" in terms of human perception. Additionally, we incorporated key findings from studies that showed imitation (Shimada et al., 2008) and direct eye gaze (Babel et al., 2021) create more favorable impressions in human-robot interactions. These insights played a pivotal role in shaping the approach of our experiment and helped inform the design decisions for the robot's eye gaze behavior.
A within-subjects study design
Hypothesis: Human comfort levels will increase with a robot that mirrors their eye gaze.
Variables:
Independent: Gaze pattern
Dependent: Comfort level
Objective Metrics:
Length of eye contact
Length of gaze aversion
Polar coordinates of gaze aversion
Frequency of gaze aversion
Frequency of blinking
Subjective Metrics:
Likert scale sentiment ratings (comfort, engagement, difficulty, etc.)
Number of positive/negative comments in comment analysis
Participants:
10 participants, ages 22–30
6 male and 4 female (self-identified)
Representing three graduate programs (MHCI, MBA, MRSD)
Interests include UX design, UX research, robotics, and product management.
Protocol.
Study Protocol Overview
While existing research has explored many factors similar to ours, our study introduces novelty by isolating mirrored eye gaze while keeping other variables constant. Unlike previous studies that examined eye gaze alongside other nonverbal cues like neck movements, head tracking, and posture, we focused solely on its effects. Additionally, while most research uses trust as the dependent variable, we uniquely assessed comfort level as our subjective metric.
Participants complete a pre-study questionnaire with demographic information and give consent.
Participants calibrate eye-tracking glasses by introducing themselves to Talent, the robot.
Talent begins part 1 of the interview, with participants randomly assigned to start in either the control condition (no eye gaze mirroring) or the experimental condition (eye gaze mirroring).
The participant completes the post-study questionnaire for part 1.
Talent begins part 2 of the interview with the alternate condition (experimental vs. control).
The participant completes the post-study questionnaire for part 2.
Task Script.
Task Script Development
As the primary designer of the task script, my responsibility was to create a structured conversation between "Talent," the robot, and the participant, or the "job seeker." The goal was to simulate a natural, engaging interaction, mimicking a real job interview scenario.
To accomplish this, we used VoiceFlow, a conversational assistant design platform, to develop several conversational paths. These paths allowed for dynamic and adaptable conversations, ensuring the study could explore different interaction scenarios. However, since “Talent” did not have autonomous speech generation, we employed a Wizard of Oz technique, where a human facilitator controlled the robot’s responses behind the scenes.
The task script was divided into three segments, each containing common interview questions. These segments were designed to elicit responses that would allow us to assess how eye gaze mirroring, along with other social cues, might impact participant perception and engagement during the conversation. This structured approach ensured that we could control for external variables while focusing on the specific dynamics of eye gaze and interaction quality.
The first segment served as a general introduction and allowed for a baseline capture of eye gaze data from the participant to calibrate the eye-tracking glasses.
Segments two and three, designed to follow the natural cadence of an interview, prompted participants to answer questions. These segments served as the basis for the control and experimental conditions.
Throughout the conversation, "Talent" responded to participants with utterances such as "That's a great school!" and "Thank you for sharing" to more closely mimic the natural exchanges of conversation.
Using VoiceFlow, we were able to build multiple conversation paths in advance and select the path matching each participant's desired job title, while remaining consistent in the questions asked, as sketched below.
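Our actual paths were authored in VoiceFlow, so the code below is only a hypothetical Python sketch of the same idea: a shared set of questions plus a per-role branch, with all role names and question text invented for illustration.

```python
# Hypothetical sketch of conversation-path selection; the real paths were
# built in VoiceFlow, so role names and question text here are invented.
INTERVIEW_PATHS = {
    "ux designer": [
        "Walk me through a recent design project you led.",
        "How do you incorporate user feedback into your iterations?",
    ],
    "roboticist": [
        "Tell me about a robotics system you have built.",
        "How do you validate a system before deployment?",
    ],
}

# Segment 1: shared introduction, also used to calibrate the glasses.
COMMON_QUESTIONS = [
    "Tell me about yourself.",
    "Why are you interested in this role?",
]

def build_script(job_title: str) -> list[str]:
    """Combine the shared questions with the branch for the desired job title."""
    return COMMON_QUESTIONS + INTERVIEW_PATHS.get(job_title.lower(), [])
```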
Robotics System Development.
Experiment Software and Technical Setup
The experimental software, developed by Gerry D'Ascoli and Michelle Lu, was composed of three separate Python scripts that worked together to collect and analyze the data.
Eye Tracking Interface: The first script interfaces with the Pupil Labs Pupil Core glasses, which were used to capture the participant’s eye gaze data. This script ensured that real-time eye movement tracking could be synchronized with the robot’s actions during the interaction.
Interactive Display: The second script, developed using Pygame (after Pyglet was found to be incompatible with our technical requirements), handled the visual interaction between the participant and the robot. This script ensured that the robot's gaze and other behaviors were displayed appropriately in response to the participant’s actions.
Data Collection and Analysis: The third script collected the gaze models built up during each trial, processed the data, and exported it into a CSV file for subsequent analysis of gaze patterns and interaction dynamics.
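As a rough illustration of that export step, the sketch below writes one row of aggregated metrics per trial with Python's csv module; the column names are assumptions, since our pipeline's actual schema isn't documented here.

```python
import csv

# Column names are assumptions for illustration, not our pipeline's schema.
FIELDS = ["participant", "condition", "eye_contact_s",
          "gaze_aversion_s", "aversion_count", "blink_count"]

def export_trials(rows, path="trial_metrics.csv"):
    """Write one row of aggregated gaze metrics per trial to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

export_trials([{"participant": 1, "condition": "experimental",
                "eye_contact_s": 42.5, "gaze_aversion_s": 7.1,
                "aversion_count": 12, "blink_count": 30}])
```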
Pupil Core Interface
The Pupil Core interface plays a critical role in capturing and processing the participant's eye gaze data. The interface connects to Pupil Capture, the Pupil Labs application that receives and manages the eye tracking data streaming from the Pupil Core glasses.
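Pupil Capture exposes a ZMQ-based network API: a script asks Pupil Remote for the subscription port, then receives gaze samples as msgpack-encoded messages. The sketch below shows the general pattern, assuming the default port and an illustrative confidence threshold.

```python
import zmq
import msgpack

# Ask Pupil Remote (Pupil Capture's request port, default 50020) for the
# port that publishes data topics.
ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to all gaze topics on the publisher port.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.loads(payload, raw=False)
    # norm_pos is the gaze point in normalized scene-camera coordinates.
    x, y = gaze["norm_pos"]
    if gaze["confidence"] > 0.6:  # illustrative threshold for noisy samples
        print(f"gaze at ({x:.2f}, {y:.2f})")
```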
Robot Animation
The robot animation script was built in Pygame. It is essentially two overlaid images, one of the robot's face and one of its pupils; the pupils move based on input polar coordinates (r, θ).
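A minimal sketch of that layering follows; the asset file names, eye center, and window size are assumptions for illustration.

```python
import math
import pygame

pygame.init()
screen = pygame.display.set_mode((800, 600))  # assumed window size

# Hypothetical asset names; the two layers are the face and the pupils.
face = pygame.image.load("robot_face.png").convert_alpha()
pupils = pygame.image.load("pupils.png").convert_alpha()

def draw_robot(r, theta, center=(400, 300)):
    """Blit the face, then offset the pupil layer by the polar input (r, theta)."""
    dx = r * math.cos(theta)
    dy = -r * math.sin(theta)  # screen y grows downward
    screen.fill((255, 255, 255))
    screen.blit(face, face.get_rect(center=center))
    screen.blit(pupils, pupils.get_rect(center=(center[0] + dx, center[1] + dy)))
    pygame.display.flip()
```

In the mirroring condition, each new sample from the eye tracker would drive a call like draw_robot(r, theta) with the participant's own gaze offset.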
Robot Design: Robot versus Computer Graphics
After exploring various robot options in the AI Maker Space at Carnegie Mellon University, we ultimately decided to represent our robot using computer graphics rather than a physical robot. This decision was driven by limited access to robots with eyes that could be manually programmed to mirror gaze behavior in real time, a key element of our study.
By opting for a computer-generated version of the robot, we were able to maintain control over the eye gaze dynamics, ensuring that the robot's eye movements could be precisely aligned with the experimental conditions. The graphical representation allowed us to simulate a realistic interaction while overcoming the logistical constraints of physical robot manipulation. This approach also offered greater flexibility in refining the robot's appearance and behavior without being limited by hardware capabilities.
Survey Design.
Post-Experiment Survey Design
Designed by Sophia Timko, the post-experiment survey aimed to capture participants' subjective experiences following each trial. The survey was based on a 5-point Likert scale, with many questions derived from the Robotic Social Attributes Scale (RoSAS), which assesses perceptions relevant to comfort in human-robot interactions.
To manage bias and ensure balanced responses, the wording of the survey was carefully crafted so that half of the questions were framed positively and the other half negatively. This approach helped minimize the influence of response biases, ensuring that participants’ feedback reflected a more accurate representation of their experience.
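Balanced framing like this is typically paired with reverse-scoring at analysis time, so that a higher number always means a more favorable impression. A small sketch, with hypothetical item names:

```python
# Hypothetical item names; negatively framed items are reverse-scored so
# that a higher score always reflects a more favorable impression.
NEGATIVE_ITEMS = {"awkward", "uncomfortable", "distracting", "unnatural"}

def score(item: str, rating: int, scale_max: int = 5) -> int:
    """Map a raw 1-5 Likert rating so higher is always more favorable."""
    return (scale_max + 1 - rating) if item in NEGATIVE_ITEMS else rating
```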
Additionally, to allow participants to elaborate on their thoughts and provide feedback beyond the structured survey questions, we included three open-ended questions. These allowed participants to express their opinions on aspects of the interaction that may not have been captured by the Likert scale questions, offering valuable insights into the factors influencing their perceptions of the robot. This open-ended section provided a deeper understanding of how eye gaze mirroring and other variables shaped participants’ experiences and emotional responses during the interaction.
Limitations.
Limitations and Influencing Factors
Several factors may have influenced the participants’ experiences and the outcomes of our study:
Novelty Effect: The eye-tracking glasses, while essential for capturing gaze data, introduced a sense of novelty among participants. Comments like “The glasses are so cool!” were common, suggesting that the technology itself may have distracted participants or influenced their behavior. This may have led to unnatural postures and mannerisms, affecting the natural flow of interaction and potentially skewing the results.
Environment: The presence of all group members during the study and the job interview-themed conversation likely contributed to increased pressure and discomfort for some participants.
Technical Difficulties: Technical issues with the eye-tracking glasses during certain trials may have impacted the participants' overall experience. Disruptions in gaze tracking could have introduced inconsistencies, potentially influencing both the objective metrics and the participants' perceptions of the robot’s attentiveness.
Simplified Software: In an effort to streamline the data collection process, we had to discretize aspects of the gaze data rather than develop a more generalized gaze model. While this simplification allowed us to move forward with the experiment, it may have limited the system's natural behavior and potentially affected the robot's ability to mirror gaze in a truly fluid, dynamic way (see the sketch after this list).
These factors highlight several challenges and constraints that may have influenced the results.
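To make the "simplified software" point concrete, the sketch below shows the kind of discretization involved: snapping a continuous gaze angle to a few fixed directions. The bin count and labels are assumptions, not our actual scheme.

```python
import math

# Quantize a continuous gaze angle into four directions. The number of
# bins and their labels are assumptions, not the study's actual scheme.
DIRECTIONS = ["right", "up", "left", "down"]

def discretize(theta: float) -> str:
    """Snap a gaze angle in radians to the nearest of four directions."""
    sector = round(theta / (math.pi / 2)) % 4
    return DIRECTIONS[sector]

print(discretize(0.2))          # near 0 rad -> "right"
print(discretize(math.pi / 2))  # straight up -> "up"
```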
Future Direction.
Future Improvements and Direction
Several areas for improvement were identified during our study, which could enhance the effectiveness and realism of future experiments:
Experiment Autonomy: One key improvement would be to develop a more autonomous experiment, eliminating the need for the Wizard of Oz technique, where a human facilitator controls the robot’s responses. Achieving full automation would enhance the natural flow of the interaction and reduce potential bias introduced by human intervention, leading to more reliable results.
Improve Gaze Recognition: To capture more accurate and natural user data, gaze recognition technology could be improved. Enhancing the precision of gaze tracking and the ability to detect subtle eye movements would allow for more realistic and dynamic gaze mirroring. This improvement could also help overcome some of the challenges posed by the novelty effect and technical difficulties we encountered in the current study.
Simplified Software: Replacing the computer graphics with a physical robot would improve the realism of the experiment. This would streamline the software by removing the need for complex graphical interfaces, while also offering a more authentic experience for the participants. A physical robot would provide a more accurate simulation of human-robot interactions, allowing for better insights into the effects of eye gaze mirroring in real-world settings.
These improvements could further refine the study, making it more scalable, realistic, and effective in exploring the impact of eye gaze mirroring in human-robot interactions.