The sensory world is a “blooming, buzzing” confusion of information. The brain processes different types of information in different areas, then reunites them to produce the impression of a seamless, integrated world. How the brain solves this “binding problem” is poorly understood. VPAL attacks the problem with multiple converging techniques. Understanding the neural basis of binding will also help us to understand conditions in which binding fails, such as autism, schizophrenia, and Williams syndrome.

Infrared Eye Tracker


The first step to understanding the neural circuitry that mediates attention and binding is to observe and understand participants’ behaviour. We measure the accuracy and reaction time of participants’ judgments of sensory information. Recording their eye movements via infrared cameras also allows us to investigate how the brain extracts sensory information in real time.
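As a toy illustration of this first step (not the lab's actual analysis pipeline), accuracy and mean reaction time for a behavioural task could be summarised as follows; the trial format, field names, and numbers are assumptions for the example:

```python
# Toy summary of behavioural data: accuracy and mean reaction time.
# Each trial is a (correct: bool, reaction_time_ms: float) pair; this
# format is invented for illustration, not the lab's real data format.

def summarise_trials(trials):
    """Return (proportion correct, mean reaction time on correct trials)."""
    n_correct = sum(1 for correct, _ in trials if correct)
    accuracy = n_correct / len(trials)
    # Reaction time is typically analysed on correct trials only.
    correct_rts = [rt for correct, rt in trials if correct]
    mean_rt = sum(correct_rts) / len(correct_rts)
    return accuracy, mean_rt

# Four made-up trials: three correct, one error.
trials = [(True, 412.0), (True, 388.0), (False, 530.0), (True, 402.0)]
accuracy, mean_rt = summarise_trials(trials)
print(accuracy, mean_rt)  # proportion correct, mean correct RT in ms
```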

Electroencephalography (EEG) Data

EEG measures synchronized activity across populations of neurons with high temporal resolution while participants perform behavioural tasks. The technique lends itself well to a wide range of participants, including children and special populations. fMRI complements EEG by providing high spatial resolution; together they produce a clear picture of the brain activity underlying cognition.
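A core EEG analysis of this kind is averaging stimulus-locked trials into an event-related potential (ERP), which cancels activity that is not time-locked to the event. The sketch below uses simulated data; the sampling rate, trial counts, and component shape are assumptions, not the lab's recordings:

```python
import numpy as np

# Minimal ERP sketch: averaging across trials suppresses non-time-locked
# activity, leaving the event-related potential. Everything here is
# simulated; 250 Hz sampling and 200 trials are illustrative choices.

rng = np.random.default_rng(0)
n_trials, n_samples = 200, 250
t = np.arange(n_samples) / 250.0                    # time in seconds (1 s epoch)

# A P300-like bump at 300 ms buried in much larger single-trial noise:
signal = 5.0 * np.exp(-((t - 0.3) ** 2) / 0.002)
trials = signal + rng.normal(0.0, 10.0, size=(n_trials, n_samples))

erp = trials.mean(axis=0)                           # average across trials
peak_latency = t[np.argmax(erp)]                    # recovered component latency
print(f"ERP peak near {peak_latency:.3f} s")
```

Averaging 200 trials shrinks the noise by a factor of about sqrt(200), so the 300 ms component re-emerges clearly even though it is invisible on any single trial.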

Delving down to the next level, we analyze the responses of individual neurons to determine the algorithms by which they encode our perceptions. We then combine neuronal and EEG recordings to determine how activity across the network of brain areas controls attention and binds sensory information to produce our experience of a rich and vibrant world.
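At the single-neuron level, responses are commonly summarised as a peri-stimulus time histogram (PSTH): spikes are binned relative to stimulus onset and converted to a firing rate. The sketch below is a minimal illustration with invented spike times and bin width, not the lab's recording setup:

```python
import numpy as np

# Toy peri-stimulus time histogram (PSTH): bin spike times across trials
# and convert counts to a firing rate in spikes/s. Spike times and the
# 100 ms bin width are made up for illustration.

def psth(spike_times_per_trial, t_start, t_stop, bin_width):
    """Return (bin_edges, firing rate per bin in spikes/s), averaged over trials."""
    edges = np.arange(t_start, t_stop + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for spikes in spike_times_per_trial:
        hist, _ = np.histogram(spikes, bins=edges)
        counts += hist
    rate = counts / (len(spike_times_per_trial) * bin_width)
    return edges, rate

# Two fake trials of spike times (seconds relative to stimulus onset):
trials = [np.array([0.05, 0.12, 0.13, 0.32]),
          np.array([0.06, 0.11, 0.14])]
edges, rate = psth(trials, t_start=0.0, t_stop=0.4, bin_width=0.1)
print(rate)  # spikes/s in each 100 ms bin
```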

Spatial Attention

When we look around, we see objects in different places, and the visual system uses spatial location to speed processing. Attending to a location can enhance visual processing there, as when a batter waits for a pitch or a goalie prepares to block a shot. Spatial attention can also inhibit processing at locations that are distracting or irrelevant to the task at hand.

We are investigating the different mechanisms involved in spatial attention. The results will aid in understanding attentional disorders such as neglect and ADHD.

Auditory-Visual Attention

Regions of the brain that process different features interact to produce a perceptual unit: the object. These features can lie within a single modality, such as the color and shape of a car (both visual), or across modalities, such as the ticking of a clock (auditory) and the movement of its hands (visual).

This research investigates how we select features across modalities and bind them together. The results will provide insight into clinical disorders in which binding breaks down, such as schizophrenia, where binding problems may manifest as hallucinations.

Eye-Hand Coordination

We do not interact with the world only by passively viewing it: we reach out and grasp it, using our hands to manipulate the objects around us. Recent studies have shown that visual processing is enhanced near the hand, in both healthy subjects and patient populations.

We are investigating the circuitry underlying hand-deployed visual spatial attention. The results will inform rehabilitation and treatment techniques for a number of clinical disorders, including the cognitive aspects of degenerative motor diseases (e.g. Parkinson's disease), neglect, attentional disorders, and autism.

Object-based Attention

Objects are the fundamental units of both perception and action, allowing us to see and interact with the world around us. The brain takes the two-dimensional image projected onto the retina and converts it, with apparent ease, into a three-dimensional, object-rich representation of the visual world.

We aim to understand object-based attention, i.e. the preferential processing of an object of interest. Problems in object-based attention and feature binding are found in a number of neurological disorders, such as autism and schizophrenia.

Representation of Social Information

We are investigating how people encode high-level social information such as gender. Gender cues are present, for example, both in faces and in the way we walk. We see different patterns of brain activity when participants judge the gender of a face depending on whether it is preceded by another face or by a human walking, which suggests that gender is not bound across these two types of representation.

Determining how we process and encode social information is important for understanding our social behaviour and how it breaks down in conditions such as autism.