- Abstract
While artificial intelligence is rapidly advancing, conventional robotic navigation continues to struggle to replicate the sophisticated spatial abilities of animals. This presentation will showcase the Neurocybernetic team’s work at the ETIS lab, exploring an approach based on neurorobotics, a field at the intersection of robotics and neuroscience, to address these limitations. Specifically, we’ll investigate visual navigation, a process largely supported by a key suite of brain structures known as the hippocampal formation, where researchers have identified specialized cells with remarkable spatial properties. These cells, including head-direction cells, place cells, and grid cells, are considered the neural substrate underpinning this ‘brain GPS’ system.
However, fundamental questions remain regarding how this system operates. How are internal (idiothetic) and external (allothetic) information sources integrated? And how do the diverse types of spatial cells interact to support the development of various navigation strategies? While computational neuroscience models attempt to address these questions, they often rely on data from controlled, small-scale experiments, limiting our understanding of cell activity in more complex, natural environments. Robots offer a valuable opportunity to test these models in the real world and, if successful, could reveal novel control architectures for mobile robotics.
- About the speaker
Nicolas Cuperlier received a Ph.D. in Computer Science from CY Cergy Paris University in 2006. From 2006 to 2007, he conducted research on active vision and attentional processes at the LIMSI laboratory (CNRS, University of Orsay). Since 2008, he has served as an assistant professor in the Neurocybernetic team at the ETIS lab, which he currently heads. His research interests encompass visual perception, sensorimotor learning, and the interplay between emotion and cognition, particularly for robot self-assessment and Human-Robot Interaction. More specifically, his work focuses on spatial cognition and navigation behaviors in both biological and robotic systems, using a neurorobotic approach to embody and evaluate bio-inspired models on mobile robotic platforms. Currently, he investigates the intersection of spatial cognition and emotion, designing and evaluating control architectures for self-driving vehicles operating in large outdoor environments.