Our work uses state-of-the-art methods, including psychophysics, eye tracking, electrophysiology (EEG), functional and structural brain imaging, computational modeling, Virtual Reality (VR), brain stimulation (e.g. TMS), and neuropsychological studies of patients with acquired cognitive, sensory and motor impairments (e.g. stroke) and neurodegenerative illnesses (e.g. Parkinson's disease).
The pain research team investigates brain mechanisms by which the conscious experience of pain (both experimental and clinical) is generated and how brain states influence pain-related behaviour.
Our aim is to identify brain mechanisms and biomarkers that can both predict outcomes from clinical interventions (for example, spinal cord stimulation or cognitive-behavioural treatments) and aid in the development of new treatments. Our strategy is to translate theory and evidence from cognitive neuroscience and pain psychology into the development of new clinical methods for improving the long-term prognosis of patients with chronic pain. We have expertise in the use of human experimental pain models and the application of electrophysiology and neuroimaging to analyse brain structure and pain processing.
We study the neurobiological and psychological foundations of low-level and high-level vision and attention systems in the human brain, using a wide range of techniques: psychophysics, eye tracking, EEG, f/MRI, brain stimulation (e.g. TMS), and neuropsychological studies of patients with visual deficits (e.g. glaucoma, Parkinson's disease, stroke). Our aim is to translate basic vision research into non-invasive behavioural interventions. We also work with machine vision and robotics collaborators on the development of biologically inspired robotics and machine vision applications.
Work in visual perception also includes the study of visual attention, spatial cognition and symmetry; in turn, perception research helps us understand visual preferences. More information in this area is available from the pages of the Visual Perception lab.
Multisensory Research, Virtual Reality & Cortical Plasticity
How does the brain combine inputs from different modalities (vision, hearing and touch) and how is this information used for perception and action?
We study the mechanisms underlying integrative auditory-visual (AV) and visuo-haptic (VH) processes, e.g. is AV integration modulated by spatial, temporal and semantic factors? Do similar mechanisms govern haptic and visual object perception? To measure sensory integration we use explicit measures (reaction times; performance) and intrinsic measures (e.g. body posture in VR environments). Our multisensory research, in conjunction with the state-of-the-art VR system at the Virtual Engineering Centre (VEC), has direct applications for learning and rehabilitation.
Virtual Engineering Centre
Our collaboration with the VEC has led to the integration of VR technologies and immersive environments in education and business.