
Multi-objective backpropagation for training neural networks


Faculty Member's Name: Stephen Chen
Faculty Member's Email Address: sychen@yorku.ca
Department/School: School of Information Technology
Project Title: Multi-objective backpropagation for training neural networks


Description of Research Project

Backpropagation is the standard method for training a neural network. It is primarily designed for supervised learning against a single error function. Beyond maximizing accuracy, AI models often have other considerations, which can lead to multi-objective optimization problems. How can backpropagation be modified to support multiple objectives?
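One common starting point (a sketch, not the project's method) is scalarization: combine two objectives into a single weighted loss so that ordinary gradient descent still applies. The example below assumes a single linear layer, mean squared error as the primary (accuracy) objective, and an L2 weight penalty as a stand-in secondary objective; all names and the weighting parameter `alpha` are illustrative.

```python
import numpy as np

# Sketch: multi-objective training via a weighted-sum (scalarized) loss.
# Assumptions: linear model, MSE primary objective, L2 penalty secondary.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # toy inputs
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(3)
alpha = 0.9                              # weight on the primary objective
lr = 0.05

for _ in range(500):
    pred = X @ w
    # Gradient of MSE: (2/n) * X^T (pred - y)
    grad_mse = 2 * X.T @ (pred - y) / len(y)
    # Gradient of the secondary objective (L2 penalty): 2 * w
    grad_l2 = 2 * w
    # Combined update: weighted sum of the two objective gradients
    w -= lr * (alpha * grad_mse + (1 - alpha) * grad_l2)

mse = float(np.mean((X @ w - y) ** 2))
```

Varying `alpha` traces out trade-offs between the two objectives; more sophisticated approaches (e.g., Pareto-based methods) avoid committing to a single weighting in advance.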


Undergraduate Student Responsibilities

- Code a benchmark implementation of a neural network
- Evaluate the NN for both accuracy and a secondary objective
- Modify back propagation to consider this secondary objective
- Develop new visual methods to display and analyze the collected data


Qualifications Required

- Advanced programming skills with specific familiarity with OOP and the Python programming language, especially the SciPy and NumPy libraries
- Excellent data management habits including specific familiarity with key tools such as Excel and Jupyter notebooks
- Functional understanding of data preprocessing and data analysis
- Strong interest in machine learning, optimization, and real-world research

Interested in this project posting?

Submit your resumé and a cover letter specific to this project to the faculty supervisor. Deadline: February 6, 2026 by 4 p.m.
