May I Help You? Predicting Help-seeking in Human-Agent Teaming in Challenging Environments

Overview:

In challenging environments, robots have the potential to be especially helpful partners when humans become incapacitated. Wayfinding is a key task in which human partners often rely on subtle nonverbal cues to decide when to offer help, yet current AI agents lack this ability. We took a first step toward addressing this gap by modeling help-seeking from multimodal behavioral signals, including locomotion speed, gaze patterns, head movement and body position, and electrodermal activity (EDA). To do so, we created an underwater simulation in immersive virtual reality with systematically varied navigation difficulty. Seventy-two participants completed wayfinding tasks in this environment, and we used the collected behavioral data to train predictive models of help-seeking. Our results indicated that gaze scanning, head scanning, and locomotion slowdown were reliable indicators of impending help requests. This work demonstrates the feasibility of inferring help-seeking needs from natural behaviors, offering design implications for proactive AI assistance in high-stakes, real-world scenarios.

Research Team:

Tianqi Liu, Weiche Lin, Yejoon Yoo, Saleh Kalantari, Andrea Stevenson Won

Year:

Publication:

Design + Augmented Intelligence Lab

2427 Martha Van Rensselaer Hall
Ithaca, NY 14853
t. 607.255.3145

f. 607.255.0305

© Copyright Design and Augmented Intelligence Lab