Perceptions and Movements in Collective Virtual Reality
Behavior is a fundamental property of living organisms. Individuals move through space, gather resources, mate, and form collective structures. They produce adapted responses to their environment by perceiving external stimuli, e.g., the direction of light or the positions of others, and internal stimuli, e.g., proprioception. The central problem of modeling behavior is identifying functions that can predict individuals' actions from their perceived environment, which makes a precise description of that environment critical. Recent advances in Virtual Reality (VR) allow us to investigate these questions by immersing individuals in a 3D virtual environment where we can finely control each individual's visual field. This provides a unique opportunity to study the role of vision in collective and individual behavior. I joined the CRI last year to design a general platform for studying behavior through the networking and automation of VR systems, with two objectives in mind: (i) studying the relation between perception and movement, and (ii) providing an open platform for collective VR. I will present experiments in which people interact with a single object, discuss how this project has evolved during the ongoing pandemic, and outline the shape it will take in the coming years.