BabyLabs Days
The BabyLabs Days in Marseille, on the Aix-Marseille Université campus
17 and 18 October!
Vertigo: a radio conversation with Christian CHABBERT
Interview of 16 September on BEUR FM
Seminar of the Sustainable Development working group (GT Développement Durable), 30 September, 11:00 to 12:00, Salle des Voûtes
The Sustainable Research working group & discussions toward a first shared vision of research sustainability at the CRPN
New ERC Starting Grant at the CRPN
Congratulations to Isabelle DAUTRICHE
Participants sought for a proprioception study
A study of movement perception in adolescents
Welcome to Hippolyte GROS, who joins the DEPHY team
Publication in Frontiers in Aging Neuroscience
by Léonard SAMAIN-AUPIC, Mariama DIONE, Edith RIBOT-CISCAR, Rochelle ACKERLEY and Jean-Marc AIMONETTI: "Relations between tactile sensitivity of the finger, arm, and cheek skin over the lifespan showing..."
The Labo des Minots at the Souk des Sciences
Wednesday 10 July, 10 a.m. to 6 p.m., Marseille Vieux Port. Free and open to all! Grey matter handed out, demonstrations, experiments, meetings with researchers...
Article in Nature Communications by Adrien MEGUERDITCHIAN et al.
"Planum Temporale asymmetry in newborn monkeys predicts the future development of gestural communication's handedness"
New publication in Scientific Reports
"Developmental changes of bodily self-consciousness in adolescent girls" by Lisa RAOUL, Cédric GOULON, Fabrice SARLEGNA & Marie-Hélène GROSBRAS

The Monday Seminars welcome Eric CASTET, Sense, Movement & Perception (SMP) team

Date: 7 October

Time: 11:00 to 12:00

Location: Salle des Voûtes, Building 9, Marseille St Charles campus

 

Title: PTVR – Python software for making virtual reality experiments easier to build and more reproducible

Researchers increasingly use virtual reality (VR) to run behavioral experiments, especially in vision science. These experiments are usually programmed directly in so-called game engines, which are extremely powerful. However, this programming process is tricky and time-consuming, as it requires solid knowledge of game engines. Consequently, the anticipated, prohibitive effort discourages many researchers who would like to engage in VR.

In this seminar I will present the Perception Toolbox for Virtual Reality – PTVR (https://ptvr.inria.fr), with which visual perception studies in VR can be created through high-level Python scripting. A crucial consequence of using a script is that an experiment can be described by a single, easy-to-read piece of code, thus improving the transparency, reproducibility, and reusability of VR studies (cf. the "FAIR" principles). This new toolbox should also dramatically reduce the difficulty of programming experiments in VR and enable a whole new set of visual perception studies with high ecological validity.

As an illustration of PTVR's power and scientific relevance, I will present some PTVR experiments currently run in our group to assess the performance of people with low vision in sophisticated pointing tasks. These tasks rely on the rich interactivity available in PTVR and also aim to improve the visual performance of these patients.


 

News