EgoCogNav: Cognition-aware Human Egocentric Navigation

Overview:

Modeling the cognitive and experiential factors of human navigation is central to deepening our understanding of human–environment interaction and to enabling safe social navigation and effective assistive wayfinding. Most existing methods focus on forecasting motion in fully observed scenes and often neglect the human factors that capture how people feel and respond to space. To address this gap, we propose EgoCogNav, a multimodal egocentric navigation framework that predicts perceived path uncertainty as a latent state and jointly forecasts trajectories and head motion by fusing scene features with sensory cues. To facilitate research in the field, we introduce the Cognition-Aware Egocentric Navigation (CEN) dataset, consisting of 6 hours of real-world egocentric recordings that capture diverse navigation behaviors. Experiments show that EgoCogNav learns a perceived-uncertainty signal that correlates strongly with human-like behaviors such as scanning, hesitation, and backtracking, while generalizing to unseen environments.
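To make the architecture described above concrete, here is a minimal sketch of the core idea: fuse scene features with sensory cues, estimate perceived path uncertainty as a latent variable, and condition the trajectory and head-motion forecasts on that uncertainty. All dimensions, weights, and function names here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, w, b):
    return x @ w + b

# Hypothetical dimensions -- chosen for illustration, not from the paper.
D_SCENE, D_SENS, D_HID, H = 64, 16, 32, 12  # feature sizes; H = forecast horizon

# Randomly initialized weights stand in for trained parameters.
W1 = rng.normal(size=(D_SCENE + D_SENS, D_HID)) * 0.1
b1 = np.zeros(D_HID)
W_u = rng.normal(size=(D_HID, 1)) * 0.1          # latent-uncertainty head
b_u = np.zeros(1)
W_traj = rng.normal(size=(D_HID + 1, H * 2)) * 0.1   # (x, y) per future step
b_traj = np.zeros(H * 2)
W_head = rng.normal(size=(D_HID + 1, H * 3)) * 0.1   # head rotation per future step
b_head = np.zeros(H * 3)

def forward(scene_feat, sensory_feat):
    """Fuse scene + sensory features, estimate perceived uncertainty,
    then condition the trajectory and head-motion forecasts on it."""
    fused = np.tanh(linear(np.concatenate([scene_feat, sensory_feat]), W1, b1))
    # Perceived path uncertainty as a latent state in (0, 1).
    u = 1.0 / (1.0 + np.exp(-linear(fused, W_u, b_u)))
    # Both forecast heads see the fused features *and* the uncertainty.
    cond = np.concatenate([fused, u])
    traj = linear(cond, W_traj, b_traj).reshape(H, 2)
    head = linear(cond, W_head, b_head).reshape(H, 3)
    return float(u[0]), traj, head

u, traj, head = forward(rng.normal(size=D_SCENE), rng.normal(size=D_SENS))
print(f"uncertainty={u:.3f}, traj={traj.shape}, head={head.shape}")
```

The design point the sketch illustrates is that uncertainty is not a side output: it feeds back into both forecast heads, so high perceived uncertainty can shift the predicted motion toward behaviors like scanning or hesitation.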

Research Team:

Zhiwen Qiu, Ziang Liu, Wenqian Niu, Tapomayukh Bhattacharjee, Saleh Kalantari

Year:

Publication:

Design + Augmented Intelligence Lab

2427 Martha Van Rensselaer Hall
Ithaca, NY 14853
t. 607.255.3145

f. 607.255.0305

© Copyright Design and Augmented Intelligence Lab