May I Help You? Predicting Help-seeking in Human-Agent Teaming in Challenging Environments
Overview:
In challenging environments, robots have the potential to be especially helpful partners when humans become incapacitated. Wayfinding is a key task in which human partners often rely on subtle nonverbal cues to decide when to offer help, yet current AI agents lack this ability. We took a first step toward addressing this gap by modeling help-seeking from multimodal behavioral signals, including locomotion speed, gaze patterns, head movement, body position, and electrodermal activity (EDA). To do so, we created an underwater simulation in immersive virtual reality with systematically varied navigation difficulty. Seventy-two participants completed wayfinding tasks in this environment, and we used the collected behavioral data to train predictive models of help-seeking. Our results indicated that gaze scanning, head scanning, and locomotion slowdown were reliable indicators of impending help requests. This work demonstrates the feasibility of inferring help-seeking needs from natural behaviors and offers design implications for proactive AI assistance in high-stakes, real-world scenarios.
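To illustrate the modeling step, below is a minimal sketch of how windowed multimodal features might be fed to a classifier to predict impending help requests. It uses scikit-learn with synthetic placeholder data; the feature set, window labels, and model choice are illustrative assumptions, not the study's actual pipeline.

# Hypothetical sketch: predicting help-seeking from windowed multimodal
# features. Data and feature names are illustrative placeholders, not
# the recorded VR signals from the study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Each row is one time window: [mean locomotion speed, gaze scan rate,
# head yaw variance, mean EDA]. Synthetic stand-in for behavioral data.
X = rng.normal(size=(500, 4))
# Label: 1 if a help request followed the window, 0 otherwise (synthetic).
y = rng.integers(0, 2, size=500)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"Mean ROC-AUC across folds: {scores.mean():.2f}")

In practice, each window would be extracted from the time-aligned VR logs and physiological recordings, with labels derived from observed help requests; cross-validation would be grouped by participant to avoid leakage.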
Research Team:
Tianqi Liu, Weiche Lin, Yejoon Yoo, Saleh Kalantari, Andrea Stevenson Won
Year:
Publication: