The Virtual Reality Facility (https://hnp.fcbg.ch/home/virtual-reality/) at the Fondation Campus Biotech Geneva (FCBG) is part of the Human Neuroscience Platform. It provides researchers with state-of-the-art equipment and expertise in immersive interaction and motion analysis in virtual reality, for experimental research and clinical applications (e.g. cognitive and affective assessment, cognitive and behavioral therapy, neurological rehabilitation, gait and upper-limb neuro-prostheses).

Job Description

The brain operates in a world where all things have a spatial location and most important things are objects. Tools, faces, buildings, and words are among the things with which we interact most often, and their meanings are defined by their precise spatial structure. Objects and space are thus tightly linked, and cognitive functions such as attention must operate on these rich, integrated sets of information to orient resources and guide successful behavior. How we perceive and use ego-centered space has been extensively characterized. In contrast, how the brain defines object-centered space (i.e. within-object spatial coordinates) and uses it attentively to guide behavior remains a deep mystery. This is due in part to the lack of a solid neuroscientific hypothesis about the neural underpinnings (which I aim to provide), and in part to the lack of appropriate experimental approaches.

The functional activation of object-centered cognition has mostly been isolated with experimental paradigms in which simple stimuli are presented on a computer screen and repeated hundreds of times. This procedure lets the researcher control multiple factors, but it alters how processing resources are allocated to sensory information, often requires extensive training, and induces artificial strategic processes. These influences are at odds with how cognition unfolds in real-life situations, where the visual input is dynamic, unrepeated, and cluttered with many distinct objects that are most of the time inextricably related.

Virtual reality (VR) environments offer the best tool to manipulate objects and space independently in a naturalistic manner. We will design a dynamic VR experiment in which participants perform a naturalistic search task. Subjects will be asked to make an object-centered, an ego-centered, or a non-spatial judgment on the to-be-found objects. The task will be compatible with a mixed blocked and event-related fMRI study that will subsequently reveal the network supporting the three cognitive reference frames. We will record not only accuracy and reaction times but also eye movements and pupil diameter, which provide a window into the subjects' natural exploratory behavior under the different cognitive conditions. Critically, the task will be fully parametrized to give the experimenter psychophysics-like control over the parameters of interest. It will also be backed up by two existing traditional experiments I recently developed to overcome difficulties in data interpretation.
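
As a rough illustration of what such a fully parametrized task could look like on the Unity side, the sketch below defines the three judgment conditions and a serializable per-trial parameter set; the type names (JudgementType, SearchTrial, TrialBlock), the fields, and the example questions are hypothetical illustrations, not part of the project specification.

    using System;
    using UnityEngine;

    // Hypothetical parametrization of the search task: which reference frame is
    // probed, which object is the target, and when/where it appears.
    // (Assumed to live in a file named TrialBlock.cs so Unity can create the asset.)
    public enum JudgementType
    {
        ObjectCentered,   // e.g. "is the door on the left side of the house?"
        EgoCentered,      // e.g. "is the house on your left?"
        NonSpatial        // e.g. "is the target a face or a house?"
    }

    [Serializable]
    public class SearchTrial
    {
        public JudgementType judgement;        // cognitive reference frame probed on this trial
        public string targetObjectId;          // which stimulus from the face/house set is the target
        public Vector3 targetPosition;         // where the target appears along the path
        public float targetOnsetTime;          // seconds from block onset (event-related timing)
        public float targetVisibleDuration;    // how long the target stays visible
    }

    // A block groups the trials run on one back-and-forth pass through an environment,
    // mirroring the blocked fMRI design described above.
    [CreateAssetMenu(menuName = "Experiment/TrialBlock")]
    public class TrialBlock : ScriptableObject
    {
        public string environmentName;         // "forest", "city park" or "city street"
        public SearchTrial[] trials;           // ordered trials within this block
    }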

This project will produce a novel, flexible VR framework for the study of high-level object- and space-based cognitive interactions. Three new dynamic naturalistic environments (a forest, a city park, and a city street) will be developed, together with a stimulus set of visually controlled stimuli from two object categories (faces and houses). The environment will be organized in blocks (the path will run back and forth) so that it translates directly into a blocked-design fMRI experiment, and it will also include an event-related component (target events) for the measurement of time-resolved natural behavior. The project will also deliver a routine for the calibration and measurement of eye movements and pupil diameter, integrated into the task.
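
A minimal, SDK-agnostic sketch of what such an integrated eye-movement and pupil-diameter recording routine could look like in Unity is given below; the two getters are placeholders for whatever calls the headset's eye-tracking SDK actually exposes, and the class name, CSV layout, and file name are assumptions rather than project decisions.

    using System.IO;
    using UnityEngine;

    // Sketch of per-frame gaze and pupil logging with timestamped task events.
    // GetGazeDirection() and GetPupilDiameter() are placeholders to be replaced by
    // the headset's eye-tracking SDK; calibration would be handled by that SDK as well.
    public class GazeLogger : MonoBehaviour
    {
        private StreamWriter writer;

        void Start()
        {
            string path = Path.Combine(Application.persistentDataPath, "gaze_log.csv");
            writer = new StreamWriter(path);
            writer.WriteLine("time;gaze_x;gaze_y;gaze_z;pupil_mm;event");
        }

        void Update()
        {
            Vector3 gaze = GetGazeDirection();   // placeholder SDK call
            float pupil = GetPupilDiameter();    // placeholder SDK call
            writer.WriteLine($"{Time.time:F4};{gaze.x:F4};{gaze.y:F4};{gaze.z:F4};{pupil:F3};");
        }

        // Called by the task code to timestamp block onsets, target events, responses, etc.
        public void LogEvent(string label)
        {
            writer.WriteLine($"{Time.time:F4};;;;;{label}");
        }

        void OnDestroy()
        {
            writer?.Close();
        }

        // Placeholders: return the headset's gaze ray and pupil size once an SDK is chosen.
        private Vector3 GetGazeDirection() => transform.forward;
        private float GetPupilDiameter() => 0f;
    }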

This environment will offer a number of interesting possible extensions in many domains:

  • investigation of additional brain functions (attention, working memory, memory, etc.)
  • integration with additional behavioral measurements (motion capture, touch screen)
  • integration with additional brain measurements (EEG, TMS)
  • investigation of multiple populations (patients, e.g. object-centered vs. ego-centered neglect patients, but also children with attentional deficits and animal models)

If of interest, the researcher will happily share the analysis pipeline for the study of behavior (eye movements) with future users; it is planned at three levels of complexity: simple accuracy, quantitative eye movements, and modelling of eye movements.

Project planning:

  • Design of the three scenes (preliminary versions can be provided)
  • Selection of the relevant objects and implementation of the code controlling appearance, visibility, timing, position, etc.
  • Integration with eye movements and pupil diameter, including calibration procedures; coding of the experimentally relevant variables to be stored as output
  • Integration with fMRI triggers (a minimal sketch follows this list)
  • Collection of data from pilot subjects
  • (optional) Integration of the task with active navigation (e.g. via joystick, keypress, bike)
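
For the fMRI trigger item above, a minimal sketch is given here under the common assumption that the scanner's trigger box emulates a keyboard and sends one character (often '5' or 't') per acquired volume; the actual key and synchronization scheme depend on the scanner setup at hand.

    using UnityEngine;

    // Sketch of scanner synchronization, assuming keyboard-emulated triggers.
    // The trigger key is an assumption; set it to whatever the trigger box sends.
    public class ScannerSync : MonoBehaviour
    {
        public KeyCode triggerKey = KeyCode.Alpha5;  // assumed trigger character

        public int VolumeCount { get; private set; }
        public float FirstTriggerTime { get; private set; } = -1f;

        void Update()
        {
            if (Input.GetKeyDown(triggerKey))
            {
                if (FirstTriggerTime < 0f)
                    FirstTriggerTime = Time.time;    // align the task clock to the first volume
                VolumeCount++;
            }
        }

        // Time elapsed since the first trigger, used to timestamp events for the fMRI analysis.
        public float ScannerTime() => FirstTriggerTime < 0f ? 0f : Time.time - FirstTriggerTime;
    }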

Contact

vr@fcbg.ch

Required profile

  • Mastery of the Unity engine
  • Mastery of the relevant development languages (C#, C++)
  • Prior experience with VR
  • Knowledge of 3D software (Blender)
  • Knowledge of real-time optimization constraints
  • Knowledge of versioning tools (Git)
  • Excellent communication and teamwork skills
  • Fluency in English (oral and written)
  • High degree of autonomy

A plus:

  • Knowledge of eye tracking

The internship is intended for MSc-level students carrying out their 5- to 6-month final research project in 2022. The position is full-time at FCBG, on Campus Biotech.

Contact: vr@fcbg.ch
