Centre for Vision Research: How quarterbacks' brains control their hand-eye coordination and allow split-second plays

New research from York University is the first to show how several distinct brain areas control eye and hand movements – explaining, for example, how a quarterback can make a split-second play with pinpoint accuracy.

The study, recently published in The Journal of Neuroscience, examined the inner workings of the posterior parietal cortex (PPC), located towards the top and back of the skull. It acts as the brain’s game card for hand-eye coordination, playing a critical role in planning visually guided actions.

Above: Professor Doug Crawford performs computer-controlled tests to measure the accuracy of Pat Byrne's gaze and reach. Byrne, a postdoctoral fellow working in York's Centre for Vision Research, is hooked up to eye-tracking headgear.

“Football is a good example to illustrate our results. A quarterback trying to deke out the opposition would actually use separate parts of the posterior parietal cortex in rapid succession...to achieve this,” says principal investigator Doug Crawford, professor of psychology in York’s Faculty of Health and Canada Research Chair in Visuomotor Neuroscience.

The findings suggest that within the PPC, the superior parietal occipital cortex (SPOC) specializes in encoding reach goals. “In the case of trying to fake a pass, SPOC would help you pick the real player you want to throw the ball to,” says Crawford. “The midposterior intraparietal sulcus (mIPS) would help you to look at the decoy player. Then the angular gyrus (AG) would compare your current hand position to the goal you’re aiming for in order to guide your throw.”

Simply put, SPOC picks the goal, while mIPS and AG are more closely involved in planning the movements themselves, for both gaze and reach.

Scientists at York’s Centre for Vision Research (CVR) used a non-invasive procedure called transcranial magnetic stimulation (TMS) to stimulate activity in these three areas of the brain. TMS delivers mild, split-second electromagnetic pulses, with few or no side effects for participants.

Participants then performed computer-controlled tests to measure the accuracy of their gaze and reach while hooked up to eye-tracking headgear. Both left and right hands were tested, as was reaching with and without visual feedback. By comparing participants' performance with and without TMS over each brain area, Crawford and his colleagues were able to map the unique responsibilities of each area.
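
The logic behind that comparison can be sketched in a few lines of Python (a hypothetical illustration with made-up numbers, not the study's actual data or analysis): if pulses over a given area systematically shift or scatter where the hand lands, that area is implicated in planning the reach.

```python
import numpy as np

# Hypothetical illustration only -- simulated numbers, not the study's data.
rng = np.random.default_rng(0)

# Simulated horizontal reach errors (in cm) for one participant:
baseline = rng.normal(loc=0.5, scale=1.0, size=40)   # reaches without TMS
tms_spoc = rng.normal(loc=2.0, scale=1.2, size=40)   # reaches with TMS over SPOC

# If stimulation over an area shifts or scatters the reach endpoints
# relative to baseline, that area is implicated in planning the reach.
shift = tms_spoc.mean() - baseline.mean()
scatter = tms_spoc.std() - baseline.std()
print(f"Endpoint shift with TMS over SPOC: {shift:.2f} cm")
print(f"Change in endpoint scatter: {scatter:.2f} cm")
```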

“Because mIPS and AG are involved in calculating both hand and eye movement, and SPOC is dedicated to encoding the reach goal, the whole assembly is likely important for hand-eye coordination,” says Crawford.

“It’s also a good reason to wear a helmet. You wouldn’t want a hard knock on the parietal cortex,” he says.

The study’s lead investigator was kinesiology PhD student Michael Vesia, currently a postdoctoral fellow with the Sunnybrook Research Institute Brain Sciences Research Program at the University of Toronto. It was co-authored by CVR colleagues Steven Prime, a psychology PhD student; Xiaogang Yan, a research associate; and Lauren Sergio, a professor in the School of Kinesiology & Health Science in York's Faculty of Health.

The research was funded by the Canadian Institutes of Health Research.