
Xavier Lagorce is a researcher working on next-generation vision sensors and intelligent perception systems. His work focuses on event-based vision, a sensing paradigm inspired by biological vision in which pixels independently capture changes in the visual scene with extremely high temporal precision.
His research sits at the intersection of computer vision, embedded systems, and sensor design, with a strong emphasis on building complete working systems—from sensing hardware to algorithms and real-time processing pipelines. He enjoys tackling problems where theory meets engineering and where new sensing technologies open the door to entirely new ways of perceiving the world.
Xavier is particularly enthusiastic about working with students who like to experiment, build prototypes, and explore unconventional ideas in sensing and perception.
My research explores new ways for machines to see and understand the world, especially using event-based vision sensors.
Unlike traditional cameras that capture images at fixed frame rates, event-based sensors report only the changes occurring in a scene, producing a continuous stream of visual events with microsecond precision. This makes them especially powerful for applications involving fast motion, high dynamic range, or low-latency perception.
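The contrast between frame-based and event-based sensing can be illustrated with a small simulation. The sketch below is purely illustrative (it is not the group's sensor model or code): it treats each pixel as an independent change detector that emits an event `(t, x, y, polarity)` whenever its log-intensity moves by more than a contrast threshold since the last event it fired. The function name `frames_to_events` and the threshold value are assumptions chosen for the example.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2):
    """Illustrative sketch: convert intensity frames into a sparse event stream.

    Each pixel fires an event when its log-intensity changes by more than
    `threshold` relative to the level at which it last fired, mimicking the
    per-pixel change detection of an event-based sensor.
    """
    frames = np.asarray(frames, dtype=np.float64)
    log_frames = np.log(frames + 1e-6)       # avoid log(0) on dark pixels
    reference = log_frames[0].copy()         # level at which each pixel last fired
    events = []                              # (t, x, y, polarity)
    for log_frame, t in zip(log_frames[1:], timestamps[1:]):
        diff = log_frame - reference
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((t, int(x), int(y), polarity))
            reference[y, x] = log_frame[y, x]  # reset this pixel's reference
    return events

# A static scene produces no events; only the pixel that changes fires.
frame0 = np.full((4, 4), 100.0)
frame1 = frame0.copy()
frame1[2, 3] = 200.0                         # brightness step at one pixel
events = frames_to_events([frame0, frame1], timestamps=[0, 1000])
```

Note how the output is sparse: a static scene generates no data at all, which is exactly why event streams are attractive for fast motion and low-latency perception.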
In our group, we work across the full stack of these systems, including:
Designing event-based sensing architectures and digital processing pipelines
Developing algorithms and machine learning methods for event streams
Building real-time perception systems
Exploring applications in robotics, autonomous systems, and smart environments
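To give a flavor of what algorithm development for event streams involves, one commonly used representation in the field is a time surface: a per-pixel map of the most recent event time, passed through an exponential decay so that fresh activity is near 1 and stale activity fades toward 0. The sketch below is a minimal, hypothetical implementation for illustration only; the function name, time constant, and event tuple layout are assumptions, not the group's actual code.

```python
import numpy as np

def time_surface(events, t_now, shape, tau=50_000.0):
    """Minimal sketch of a time surface over an event stream.

    events: iterable of (t, x, y, polarity) tuples
    t_now:  query time (same units as event timestamps, e.g. microseconds)
    shape:  (height, width) of the sensor
    tau:    decay time constant; recent events map near 1, old ones near 0
    """
    last_t = np.full(shape, -np.inf)         # -inf marks pixels that never fired
    for t, x, y, _ in events:
        if t <= t_now:
            last_t[y, x] = max(last_t[y, x], t)
    surface = np.exp((last_t - t_now) / tau)
    surface[np.isinf(last_t)] = 0.0          # silent pixels stay at zero
    return surface

# One event at pixel (x=1, y=0), queried at the event's own timestamp.
events = [(1000, 1, 0, 1)]
surface = time_surface(events, t_now=1000, shape=(2, 2))
```

Representations like this turn an asynchronous event stream into a dense array that standard vision and machine learning tooling can consume, while still preserving fine temporal structure.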
Students working with me often get the chance to combine algorithms, hardware, and system design, and to build experimental systems that push the limits of current sensing technology. If you enjoy hands-on research and want to explore new sensing paradigms, this is a great place to start.




