
Information and Strategies for Instructors
This page has been created to help instructors understand the capabilities of generative Artificial Intelligence (AI) technology and provide strategies for helping students avoid using it to engage in academic misconduct. The following sections provide information about these tools.
- About Artificial Intelligence (AI) Technology
- Addressing AI Technology with Students
- Leveraging AI Technology as a Learning Tool
- Detection and Future Considerations
- Further Resources
Please refer to the Academic Integrity and Generative Artificial Intelligence (AI) Technology document distributed by the ASCP committee in February 2023. This document draws on York's Senate Policy on Academic Honesty to provide clarity on the use of this technology for academic work.
Note: Given that AI technology is continuously evolving, the content on this page will be monitored and updated as new information becomes available.
About Artificial Intelligence (AI) Technology
Generative artificial intelligence (AI) tools have recently received much attention in the media and within educational institutions. These tools are able to mimic, and at times to exceed, human abilities to research, write, problem-solve, create art, produce videos, and even to “learn” and evolve. As such, there is concern that generative AI tools/apps can be used to help students engage in academic misconduct.
Since the release of ChatGPT (GPT stands for Generative Pre-trained Transformer), different types of generative AI tools have been developed that are able to generate text, code and images (e.g. Bing Chat, Google Bard, AlphaCode, MidJourney and Stable Diffusion). Updates are continuously being released, including GPT-4, which came out on March 14, 2023 and boasts greater accuracy than ChatGPT (OpenAI claims GPT-4 is 60% less likely to present false information). On March 23, 2023, OpenAI announced plugin support for ChatGPT so that it can access information beyond its training data, including the ability to access the Internet. To learn more about these updates, refer to the articles listed in Updates in AI Technology.
Addressing AI Technology with Students
As an instructor, anticipate that students will be curious about these tools and that they will need clear direction about whether they are permitted to use them for academic work in your course. As such, instructors are encouraged to engage students in an open discussion about AI apps and how these intersect with academic integrity.
Institutional Expectations
When talking to students, let them know about institutional expectations:
- Any unauthorized use of ChatGPT (or other AI tools) on assessments is considered a breach of academic honesty.
- Remind them of York’s Senate Policy on Academic Honesty and provide examples of how the unauthorized use of this technology can lead to breaches such as:
- cheating, if they use an AI tool to gain an improper advantage on an academic evaluation when it has not been authorized by their instructor (Section 2.1.1), or
- plagiarism, if they use images created by another (e.g. through DALL-E or another image-generating tool) when not authorized by their instructor and not attributed to the creator (Section 2.1.3).
- Be very clear about your expectations and explicit when providing assignment instructions. To help reduce confusion, ensure these expectations are communicated in various ways, such as including them in eClass, course syllabi and instruction guidelines, and repeating them in class.
- Explain that different instructors can have different expectations for AI tools, and that if their use is permitted by one instructor, this does not mean it will be permitted by others.
Related Student Discussion Topics
You can use the points above as a springboard to further discuss the ethical implications of AI technology in your discipline and any potential problems that can arise. Some guiding discussion questions may include:
- What do you know about AI apps?
- Have you used them before? If you have, in what ways?
- What were your experiences?
- In your field/discipline, what are some ethical issues that can arise from using these apps?
- How can you ethically use AI apps to support your learning? (Eaton, 2023)
Grey Areas and Ethical Concerns
You can let students know that there are still many grey areas around AI apps and that clear answers and guidance do not yet exist (Watkins, 2022). For instance, there are no guidelines on how to appropriately cite content that has been generated by these apps. As well, beyond threats to academic integrity, there is the implication that by entering prompts and interacting with AI apps, users are helping the technology to improve and evolve. Additionally, ChatGPT collects a significant amount of data from its users, which can be shared without a user’s knowledge or permission (Trust, n.d.).
Related Student Discussion Topics
Such grey areas can also serve as a springboard for additional classroom discussion and activities. For instance, some suggestions from Eaton (2023) include:
- Develop a method for students to communicate when they have used AI apps in their academic work. For instance, they can start with this statement, which has been adapted from OpenAI:
The author generated this [text, image, or video] in part with [insert provider], a language-, image-, code-, or video-generation model. The author verifies that they have reviewed, edited and revised the draft provided and takes responsibility for the content of this [assignment name]. (OpenAI, 2022)
- Co-create class guidelines/a charter with your students regarding the use of artificial intelligence apps in your courses. Should it be used? If so, what are some guidelines pertaining to its use?
- Include an artificial intelligence discussion thread so that students can share information, ask questions, post articles, etc.
- Have students read OpenAI’s privacy policy and terms of use pertaining to ChatGPT and DALL-E. Discuss possible concerns and implications of having their data collected.
- If users are helping the technology to improve and evolve through providing prompts, what are some future implications for work in their field? Will improved AI technology be a source of help or a hindrance?
Leveraging AI Technology as a Learning Tool
Generative AI technology has created a need to revise assessment practices, and at the same time it offers some new possibilities for in-class learning.
Short-Term Strategies
When it comes to assessment, in the short-term, some strategies that can be applied include:
- asking students to hand-write assessments in class
- replacing written assessments that are submitted through eClass with assessments that are completed in class, including presentations, reflective writing assignments, in-class tests or short essays
- requiring students to submit their rough notes along with the final submission of their work
Note: If using the above strategies, be mindful of potential accessibility and equity issues that may arise when shifting assessment modalities.
Longer-Term Strategies
With more time, assessments can be redesigned so that students are not submitting work that AI apps can easily produce. Some ideas for redesigning assessments are listed below.
Consider expanding or replacing written assessments to:
- focus more on the process of the writing assignment rather than on the final product
- have students emphasize evidence of original thought and critical thinking, as AI tools have been shown to be weak at demonstrating these higher-order skills
- ask students to use current sources (post-September 2021)
- ask students to apply personal experience or personal knowledge to course topics
- or replace a written assessment with a multimodal one
You may also want to update your grading criteria or rubrics to emphasize assessment of deeper discipline-specific skills such as argumentation, use of evidence, or interpretive analysis, rather than the mechanics of writing and essay organization. This can help re-weight your assessments in favour of student learning and away from skills easily performed by AI tools.
Many educators are currently experimenting with integrating generative AI technology into their assessment design. Keeping in mind the limitations of AI, if you decide to incorporate such tools into assessments, some ways that students can use technology like ChatGPT to apply higher-order skills are to:
- generate a ChatGPT response to a particular question, and then write an analysis of the strengths and weaknesses of the ChatGPT response
- fact-check the responses that ChatGPT provides to identify incorrect information
- generate a paper from ChatGPT and evaluate its logic, consistency, accuracy and bias
- use ChatGPT to create an outline that students can then use to develop an essay. (D’Agostino, 2023; Montclair State University, n.d.; Trust, n.d.)
In-Class Activities
Beyond formal assessments, generative AI tools can also be used in ungraded or low-stakes learning activities during class time. Bringing this tech into lectures or discussions can help students understand how and when to use AI technology effectively and ethically, and in ways that align with the norms and standards of your disciplinary context. Some learning activities you might consider include:
- using AI generated text as the starting point for class discussion on a particular topic. What does it get right? What is it missing? How would it need to be revised to meet the scholarly standards of your field?
- having small teams of students experiment in using AI to create text about a given subject, and then comparing the results (what grade would they assign its response using a course rubric?) and/or the process (what prompts and tweaks were needed to generate the text?)
- engaging the class in a debate against generative AI tech, using the tool to generate counterarguments that can help students explore perspectives and strengthen their own arguments
- asking your students! Gather anonymous feedback about whether they are using the tool, what value it provides them, and how they think it should be used in your disciplinary or teaching context
Detection and Future Considerations
When it comes to AI detectors, instructors are encouraged to become informed about their limitations and the associated privacy concerns. For instance, most detectors require that you copy and paste the content you suspect has been generated by AI apps, and some, like OpenAI's AI Text Classifier, require users to register. Additionally, the reliability of detectors has been questioned; for instance, this article from the Guardian discusses how OpenAI's AI Text Classifier correctly identified only 26% of AI-generated English text.
On April 4, 2023, Turnitin released an AI detection tool as part of its existing platform. However, as this tool has yet to undergo an evaluation at York, instructors are advised not to use its results as evidence that a breach occurred.
If you have any questions or require support on academic integrity matters, contact academicintegrity@yorku.ca.
Further Resources
Bailey, J. (2022, December 7). Why Teachers Are Worried About AI
D'Agostino, S. (2023, January 31). Designing Assignments in the ChatGPT Era
Eaton, S. and Anselmo, L. (2023, January). Teaching and Learning with Artificial Intelligence Apps
McClennen, N. & Poth, R. Education is about to radically change: AI for the masses.
McKnight, L. (2022, October 14). Eight ways to engage with AI writers in higher education.
McVey, C. (2022, December 05). POV: Artificial Intelligence Is Changing Writing at the University. Let’s Embrace It
Mollick, E.R. & Mollick, L. (2022). New Modes of Learning Enabled by AI Chatbots: Three Methods and Assignments
Prochaska, E. (2023, January 23). Embrace the Bot: Designing Writing Assignments in the Face of AI
Schulten, K. (2023, January 23). Lesson Plan: Teaching and Learning in the Era of ChatGPT
Webb, M. (2022, August 04). What’s next for AI in higher education?
Alby, C. (n.d.). ChatGPT: Understanding the new landscape and short-term solutions
Alby, C. (2023, January 7). Can ChatGPT be a blessing?
Ditch that Textbook (2022, December 17). ChatGPT, Chatbots and Artificial Intelligence in Education
Hotson, B. & Bell, S. (2023, April 9). Academic writing and ChatGPT: Step back to step forward
Kovanovic, K. (2022, December 15). The dawn of AI has come, and its implications for education couldn't be more significant
Marr, B. (2023, March 3). The Top 10 Limitations of ChatGPT
Morrison, R. (2022, November 16). How to Identify AI Generated Text
Pizarro Milian, R. & Janzen, R. (2023, March 29). How are Canadian postsecondary students using ChatGPT?
Schulten, K. (2023, January 24). How Should Schools Respond to ChatGPT? (Student Opinion)
Updates in AI Technology
Blain, L. (2023, March 24). ChatGPT can now access the internet and run the code it writes
Hachman, M. (2023, March 23). ChatGPT’s new web-browsing power means it’s no longer stuck in 2021
Hearn, A. (2023, March 15). What is GPT-4 and how does it differ from ChatGPT
Metz, C. (2023, March 14). OpenAI Plans to Up the Ante in Tech’s A.I. Race
OpenAI. (2023, March 14). GPT-4
References
D’Agostino, S. (2023, January 12). ChatGPT advice academics can use now. https://www.insidehighered.com/news/2023/01/12/academic-experts-offer-advice-chatgpt
Eaton, S. (2023). Teaching and learning with artificial intelligence apps. University of Calgary. https://taylorinstitute.ucalgary.ca/teaching-with-AI-apps
Montclair State University (n.d.). Practical responses to ChatGPT. https://www.montclair.edu/faculty-excellence/practical-responses-to-chat-gpt/
OpenAI (2022). Sharing and publication policy. https://openai.com/api/policies/sharing-publication/#content-co-authored-with-the-openai-api-policy
Trust, T. (n.d.). ChatGPT and education. https://docs.google.com/presentation/d/1Vo9w4ftPx-rizdWyaYoB-pQ3DzK1n325OgDgXsnt0X0/mobilepresent?slide=id.p
Watkins, R. (2022). Update your course syllabus for chatGPT. https://medium.com/@rwatkins_7167/updating-your-course-syllabus-for-chatgpt-965f4b57b003