Embodied Photography and Cinematography
April 23, 2026 (Thursday) 10:30pm-11:30pm
Embodied photography and cinematography require a robot to integrate aesthetic judgment, camera control, and scene understanding in real time. This talk presents a unified view of the problem, spanning how robots can learn photographic style from human demonstrations, how they can robustly control focus and exposure in challenging lighting conditions, and how these capabilities can enable new applications such as intelligent birdwatching. We first discuss imitation learning methods that capture photographer intent and composition. We then cover event-based autofocus and auto-exposure systems that maintain image quality under low light and extreme illumination. Finally, we explore open-world tracking and language-guided observation for targeted wildlife capture. Together, these directions suggest a future in which robots act as active visual partners, not just passive imaging devices.
