Neurobiopsychology investigates human embodied cognition at the intersection of neuroscience, biology, and psychology. One of the main issues addressed by the research of the neurobiopsychology group in Osnabrück is multisensory integration, i.e., how the nervous system combines information from different senses into a coherent percept. Another important topic we examine is sensorimotor coupling, i.e., how the sensory system interacts with the motor system. For this purpose, we study human perception, behavior, and physiology using a variety of methods. Perception is examined with psychophysics experiments and sensory augmentation in laboratory, natural, or virtual reality settings. Behavior is investigated through eye movements and spatial navigation. Physiological evidence about brain processes is collected with EEG. Modern technologies allow us to investigate human cognition and behavior in the natural environment by combining mobile eye tracking with mobile EEG measurements. In collaboration with our research partners, we also have the opportunity to use technologies such as fMRI and MEG. These empirical studies are complemented by theoretical work based on computer simulations and conceptual philosophical analysis.
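As an illustration of what "combining information from different senses into a coherent percept" can mean formally, a standard textbook model is maximum-likelihood cue integration, where each cue is weighted by its reliability (inverse variance). The sketch below is our own illustrative example, not the group's actual model:

```python
# Illustrative sketch of maximum-likelihood multisensory integration:
# unisensory estimates are averaged, weighted by their reliabilities
# (1/variance), yielding a fused estimate with reduced variance.

def integrate_cues(estimates, variances):
    """Combine unisensory estimates into a single fused percept."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    fused = sum(r * e for r, e in zip(reliabilities, estimates)) / total
    fused_variance = 1.0 / total  # lower than any single cue's variance
    return fused, fused_variance

# Example: a visual and an auditory location estimate (in degrees).
# The reliable visual cue dominates, and the fused variance drops.
mu, var = integrate_cues([10.0, 14.0], [1.0, 4.0])
```

The fused estimate lands closer to the more reliable cue, which is the signature behavior this class of models predicts for human observers.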
The results of our work are further developed and applied by various companies:
Can humans learn a new sensory modality?
Using a sensory augmentation device, the feelSpace belt, subjects training in their natural environment receive information about magnetic north via vibrating elements around the waist. This changes how space is perceived and increases trust in navigational abilities.
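The core mapping such a belt must perform can be sketched in a few lines: given the wearer's compass heading, select which of N equally spaced vibromotors currently points toward magnetic north. This is a hypothetical sketch of that logic, not the actual feelSpace firmware, and the motor count and layout are assumptions:

```python
# Hypothetical sketch of a sensory-augmentation belt's core mapping
# (not the real feelSpace firmware). Assumes 16 equally spaced motors,
# motor 0 at the front, indices increasing clockwise.

def north_motor(heading_deg, n_motors=16):
    """Return the index of the motor currently facing magnetic north.

    heading_deg: the wearer's compass heading (0 = north, 90 = east).
    """
    # North lies this many degrees clockwise from the wearer's front:
    angle_to_north = (360.0 - heading_deg) % 360.0
    return round(angle_to_north / (360.0 / n_motors)) % n_motors

# Facing north, the front motor vibrates; facing east, north is to the
# wearer's left, so a motor on the left side of the belt is selected.
```

Because the active motor always tracks north, the tactile signal stays world-anchored while the wearer turns, which is what makes the information learnable as a direction cue.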
How does the level of available visual information change how an object is visually sampled?
We investigate the influence of eye movements on the perception and recognition of ambiguous objects. We demonstrate that action precedes perception, i.e., that there is a reverse sensorimotor coupling.
How do visual, vestibular, and kinesthetic cues interact while moving in space?
In a fully immersive virtual reality setup, subjects perform a triangle completion task by actively walking and turning. To study the integration of multiple sensory modalities, we investigate brain activity using mobile EEG.
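The geometry behind triangle completion is straightforward to make explicit: after walking one leg, turning, and walking a second leg, the correct response is the vector back to the start. The sketch below is ours (parameter names and conventions are assumptions, not the experiment's protocol):

```python
import math

# Illustrative geometry of a triangle completion task (our sketch, not
# the experiment's protocol): compute the homing distance and turn after
# walking leg1, turning turn_deg clockwise, and walking leg2.

def homing_response(leg1, turn_deg, leg2):
    """Return (distance, clockwise heading change in degrees) to return home."""
    # Coordinate frame: leg 1 goes along +y; bearings are clockwise from +y.
    turn = math.radians(turn_deg)
    x = leg2 * math.sin(turn)
    y = leg1 + leg2 * math.cos(turn)
    distance = math.hypot(x, y)
    # Absolute bearing of the home vector (-x, -y), clockwise from +y:
    home_bearing = math.atan2(-x, -y)
    # Turn needed relative to the current walking direction:
    heading_change = (math.degrees(home_bearing) - turn_deg) % 360.0
    return distance, heading_change

# Example: walk 3 m, turn 90° clockwise, walk 4 m -> home is 5 m away.
dist, turn_back = homing_response(3.0, 90.0, 4.0)
```

Comparing subjects' actual homing vectors against this ideal response is one way errors in integrating visual, vestibular, and kinesthetic self-motion cues can be quantified.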
Can partners in a joint visual search task use tactile and auditory cues to exchange gaze information?
In a psychophysics experiment, we examine whether partners improve their performance through multisensory cueing, translating the close links between individual perception and action to a teamwork context.