
AI fuels research that could lead to positive impact on health care

Brainstorm guest contributor Paul Fraumeni speaks with four York U researchers who are applying artificial intelligence to their research ventures in ways that, ultimately, could lead to profound and positive impacts on health care in this country.

Meet four York University researchers: Lauren Sergio and Doug Crawford have academic backgrounds in physiology; Shayna Rosenbaum has a PhD in psychology; Joel Zylberberg has a doctorate in physics.

They have two things in common: They focus on neuroscience – the study of the brain and its functions – and they apply artificial intelligence (AI) in their research, work that could have a profound and positive impact on health care.

In a nondescript room in the Sherman Health Sciences Research Centre, Lauren Sergio sits down and places her right arm in a sleeve on an armrest. It’s an odd-looking contraption; the lower part looks like a sling attached to a video game joystick.

Sergio is putting herself in the shoes of a person who has suffered a stroke that has hampered mobility in the arm. That’s how many strokes do their damage: a clot blocks blood flow to part of the brain, damaging the regions that control movement. But what if you combined AI engineering and neuroscience research? What if that AI could tell your brain what to do to get your arm to reach and grab something?

Lauren Sergio

Sergio, York Research Chair in Brain Health and Skilled Performance and core member of VISTA (Vision: Science to Applications), is working with IT Universe, a Toronto-based tech company, to develop the sleeve encasing Sergio’s arm. A real stroke patient would also have an EEG cap on their head that measures brainwaves and virtual reality goggles over their eyes showing images of objects, such as a balloon. “Then we say, ‘Look at the red balloon and think about moving your hand to it,’” Sergio explains.

Sergio demonstrates how a person’s arm fits into the device. Photo credit: Paul Fraumeni

Given the disconnection the stroke has caused between the brain and the arm, the patient wouldn’t be able to reach the balloon on their own. But the robotic arm can. Through machine learning, the team teaches the system to recognize the brain activity associated with arm movement. Once sufficiently trained, it translates that activity into commands for the robotic arm, which guides the patient’s hand to grasp the balloon. And by repeating the task, the system feeds signals back to the brain. “This helps repair those networks in the brain that were severed by the stroke.”
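The loop Sergio describes follows the general pattern of a motor-imagery brain-computer interface: a decoder is trained on EEG features, and its output drives the assistive arm. The sketch below is illustrative only; the actual Sergio/IT Universe system is not public, and the features, labels, and commands here are invented for the example.

```python
# Illustrative sketch of a motor-imagery BCI loop (not the actual system):
# train a decoder on EEG features, then map its prediction to an arm command.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated training data: 200 trials x 16 EEG band-power features, labeled 1
# when the person imagined reaching for the target and 0 otherwise (toy labels).
X_train = rng.normal(size=(200, 16))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)

decoder = LogisticRegression().fit(X_train, y_train)  # "teach" the decoder

def assist_step(eeg_features: np.ndarray) -> str:
    """Map one window of EEG features to a robotic-arm command."""
    intent = decoder.predict(eeg_features.reshape(1, -1))[0]
    return "extend toward target" if intent == 1 else "hold position"

print(assist_step(rng.normal(size=16)))
```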

Doug Crawford

This kind of collaborative research, with a focus on solving real-life problems, is exactly what Doug Crawford had in mind when he pitched VISTA to the federal government’s Canada First Research Excellence Fund (CFREF). In 2016, CFREF awarded York $33 million over a seven-year period. With matching funds from the University and contributions from industry partners, the total funding package is $120 million.

“VISTA’s goal is to take the outstanding model of interdisciplinary research laid down by York’s Centre for Vision Research and expand on it to bring even more researchers from a greater variety of areas together,” says Crawford, VISTA director and Canada Research Chair in Visual-Motor Neuroscience. “And our work is translational – meaning, fundamental [or discovery] research is important, but we’ll see it through to application.”

There are over 80 researchers associated with VISTA. The range of disciplines is breathtaking – from computer science to forestry, from pain management to theatre performance. The potential applications of their work are equally mind-blowing – from the quality of animation in a movie to improving children’s environmental health.

Shayna Rosenbaum is York Research Chair in the Cognitive Neuroscience of Memory and a core member of VISTA. She focuses on clinical neuropsychology, the study of the relationships between brain and behaviour. Her area of specialization is the role of the hippocampus, the part of the brain that stores information we need so we can navigate in our daily lives.

“People with Alzheimer’s become disoriented easily. That’s partially because they’re unable to learn how objects relate to one another, including landmarks. When they try to find their way in a new place, they often have difficulty.

Shayna Rosenbaum

“We’re interested in what happens when the person navigates familiar places. Because, even then, individuals in early stages of Alzheimer’s can have difficulties. So, we'd like to detect this as early as possible because we think it’s a good gauge of whether someone will develop the disease,” Rosenbaum explains.

VISTA has enabled her to collaborate with James Elder (York Research Chair in Human and Computer Vision) and Matthew Kyan, both in York’s Department of Electrical Engineering and Computer Science. They are leveraging AI to develop real-world tasks that can be used to test older adults’ navigation abilities.

“We create situations in the computer program where an older adult has to circumvent the original, known route to get to a particular location. Some patients have difficulties generating the detour. They eventually arrive at their goal location, but it’s very inefficient.”
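One way to quantify the inefficiency Rosenbaum describes is to compare the route a participant actually takes with the shortest detour available once the familiar path is blocked. The grid, positions, and ratio-based score below are hypothetical; the article does not describe the VISTA task software itself.

```python
# Hypothetical scoring sketch: compare the participant's path length with the
# shortest available detour on a grid where the familiar route is blocked.
from collections import deque

def shortest_path_len(grid, start, goal):
    """Breadth-first search over open (0) cells; blocked cells are 1."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return None  # no route exists

def detour_efficiency(grid, start, goal, steps_taken):
    """Ratio of steps taken to the optimal detour; 1.0 is perfectly efficient."""
    return steps_taken / shortest_path_len(grid, start, goal)

# The familiar straight corridor through the middle is blocked, forcing a detour.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(detour_efficiency(grid, (1, 0), (1, 2), steps_taken=8))  # optimal is 4
```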

Rosenbaum has applied for funding for a project involving the creation of a computer model of the interior of Baycrest Health Sciences, a research and teaching hospital for older adults. “We’ll put the model into virtual reality and use it to see how people learn to navigate in Baycrest. We hope to pre-expose individuals who plan to move into Baycrest to reduce instances of wandering or disorientation. Our technology might give them a sense of their new space and reduce their anxiety.”

Joel Zylberberg came to York in 2019. He’s the Canada Research Chair in Computational Neuroscience, a fellow at CIFAR (Canadian Institute for Advanced Research) and a core member of VISTA.

Among his many ventures in applying AI to neuroscience, Zylberberg is looking into using functional magnetic resonance imaging (fMRI) to teach computers to mimic brain activity. His goal is to help radiologists with their diagnoses.

Joel Zylberberg

“A few University of Alberta radiologists have agreed to sit in a scanner and examine radiology images of their patients and do their diagnostic tasks, while we look at what their brains are doing. Then we’ll use their brains as the teacher for our deep neural nets,” Zylberberg explains.

He says the goal isn’t to replace radiologists with machines. “It’s more likely to be a critical decision support tool: the radiologist would look at the image, feed it into computer software that mimics the learning ability of the brain and then study the output to see if they missed something.”
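Zylberberg’s phrase “use their brains as the teacher” suggests an objective with two parts: fit the diagnostic label and stay close to the radiologist’s recorded brain response. The combined loss below is a conceptual sketch under that assumption; the feature dimensions and weighting are invented and do not reflect the team’s actual models or data.

```python
# Conceptual sketch of a "brain as teacher" objective (not the team's code):
# penalize both a missed diagnosis and a mismatch with the fMRI response.
import numpy as np

def combined_loss(pred_prob, true_label, model_features, fmri_pattern,
                  brain_weight=0.5):
    """Cross-entropy on the diagnosis plus a penalty for straying from the
    radiologist's recorded response pattern."""
    eps = 1e-9
    diagnosis_loss = -(true_label * np.log(pred_prob + eps)
                       + (1 - true_label) * np.log(1 - pred_prob + eps))
    brain_loss = np.mean((np.asarray(model_features)
                          - np.asarray(fmri_pattern)) ** 2)
    return diagnosis_loss + brain_weight * brain_loss

# Toy example: the network assigns 0.8 probability of pathology, the true label
# is 1, and its 64-d features are compared with a (made-up) fMRI pattern.
rng = np.random.default_rng(1)
print(combined_loss(0.8, 1, rng.normal(size=64), rng.normal(size=64)))
```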

All four of these York researchers are excited about the possibilities, while also aware of the challenges that the brain presents.

“AI can help us take the brain signals and try to figure out what the brain might be trying to send from the spinal cord to the muscles to the arm – something a baby picks up easily within days,” says Sergio. “The technology isn’t perfect yet, but we’re making huge leaps. What’s happening in robotics now is astounding.”

For Zylberberg, what he values most is the multidisciplinary nature of VISTA. “My lab’s in a weird kind of space. We’re not biologists or computer scientists. I’m a physics professor but I’m not much of a physicist. So, without something like VISTA there wouldn’t be a research community that my lab would fit into. VISTA has assembled an incredible community that covers the whole spectrum.”

Rosenbaum stresses the real-world focus. “VISTA has really allowed for this kind of work I’m doing. It's important to show the link between the fundamental [discovery] research that we do, learning how the brain and AI work, and how that might apply to the real world and actually help people. VISTA is giving us that opportunity.”

To learn about Sergio’s work, visit her Faculty profile page. For more information on Rosenbaum, visit her Faculty profile page. To learn more about Zylberberg, see his profile page. For more on Crawford and VISTA, visit the VISTA website.

To learn more about Research & Innovation at York, follow us at @YUResearch; watch our new animated video, which profiles current research strengths and areas of opportunity, such as Artificial Intelligence and Indigenous futurities; and see the snapshot infographic, a glimpse of the year’s successes.

Paul Fraumeni is an award-winning freelance writer who has specialized in covering university research for more than 20 years. To learn more, visit his website.