Your Class and Generative AI
"We don't fully lean into AI and teach how to best use it, and we don't fully prohibit it to keep it from interfering with exercises in critical thinking. We're at an awkward middle ground where nobody knows what to do..."
-Owen Kichizo Terry, "I'm a Student. You Have No Idea How Much We're Using ChatGPT." The Chronicle of Higher Education, May 2023
As an instructor, it's important to formulate your own philosophy about AI use in your class, and especially to decide if and how it will vary by assignment, so that you can communicate it clearly to your students. Each instructor is likely to have different policies, which can create a confusing landscape for students if those policies are not clearly conveyed. Lack of transparency could lead to academic integrity violations, whether intentional or not.
AI Assessment Scale
In each course, it's up to the instructor to establish what is and is not considered an academic integrity violation when it comes to AI and then discuss that with students. Before developing an AI policy for your course, though, we recommend first considering AI use at the assignment level and using those decisions to guide your course-level policy.
Perkins et al. (2023) created an AI Assessment Scale (AIAS) that can serve as a helpful starting point when developing and communicating your assignment-level AI policies. The scale has five levels, each allowing a different degree of AI assistance: Level 1 indicates that AI may not be used at all for the assignment, whereas Level 5 allows for full AI use. The idea, then, is that after introducing students to the scale, instructors label each assignment with a number from it to inform students whether AI use is acceptable and, if so, to what extent. We've reproduced the AI Assessment Scale below and encourage you to read Leon Furze's "The AI Assessment Scale: Version 2" for additional context.
Level 1: No AI
The assessment is completed entirely without AI assistance. This level ensures that students rely solely on their own knowledge, understanding, and skills. AI must not be used at any point during the assessment.

Level 2: AI-Assisted Idea Generation and Structuring
AI can be used for brainstorming, creating outlines and structures, and generating ideas for improving work. No AI-generated content is allowed in the final submission.

Level 3: AI-Assisted Editing
AI can be used to improve the clarity or quality of student-created work, but no new content may be created with AI. Students must provide their original work, with no AI content, in an appendix.

Level 4: AI Task Completion, Human Evaluation
AI is used to complete specific elements of the task, with students providing discussion or commentary on the AI-generated content. This level requires critical engagement with AI output and evaluation of its quality. Any AI-created content must be cited.

Level 5: Full AI
AI may be used as a "co-pilot" throughout the assessment to meet its requirements, allowing a collaborative approach that supports the student's own work and enhances creativity. Students do not have to specify which content is AI-generated.
This scale is just one example of how to increase transparency for students surrounding acceptable (or unacceptable) AI use in a course. Each instructor can develop a policy and communication method that works for their own specific context. Some instructors might prefer to set the guidelines in advance, like the AI Assessment Scale above, whereas others might prefer a more collaborative approach with their students, and still others will land on a method somewhere in between. The critical element is to ensure that there is clear communication with your students about these policies.
Designing (or Redesigning) Assignments in an AI World
Along with creating and articulating clear AI policies, it may be necessary to make some design decisions about assignments. Some assignments may need greater scaffolding: building on prior student work and research, and emphasizing process, feedback, and revision rather than the finished product. Others could be modified to require a performance-based deliverable, like a presentation or an interview, which AI cannot easily produce. Still others might benefit from a critical, proactive use of AI, such as having students critique AI output and identify its shortcomings or potential biases.
The full spectrum of assignment design in an AI-rich world is beyond the scope of this module, but we'll provide some case studies and ideas below. Feel free to reach out to an Instructional Designer if you'd like to dive more deeply into any of these ideas, or be on the lookout for upcoming trainings, workshops, or communities of practice!
Going Deeper: Designing Assignments in an AI world
Five Days in Class with ChatGPT
Rid, The Alperovitch Institute. (Jan. 22, 2023)
This narrative from a computer science instructor is a great case study of using AI in the classroom. The example comes from a programming-heavy cybersecurity course, but a similar workflow is easy to imagine for other lab-based courses. Rid describes how ChatGPT served as a tool for students to ask foundational questions without disrupting the flow of class, allowing for real-time understanding and engagement with complex topics. The author was initially skeptical about AI's role in education but concludes that AI can democratize learning by providing instant, accessible explanations.
Eight Ways to Engage with AI Writers in Higher Education
McKnight, Times Higher Education (Oct. 14, 2022)
This article discusses the integration of AI writers into higher education and offers eight strategies for educators to help students engage with these technologies responsibly. McKnight emphasizes the importance of understanding both the potential and limitations of AI text generators. The author argues that just as spellcheck and predictive text have become part of our writing processes, AI writers will also become a standard tool. The article provides practical advice on how educators can incorporate AI writers into their curricula to enhance research, critique, and comparison skills among students.
Using AI to Implement Effective Teaching Strategies in Classrooms: Five Strategies, Including Prompts
Mollick & Mollick, SSRN (Mar. 24, 2023)
This paper provides guidance for using AI to quickly and easily implement evidence-based teaching strategies that instructors can integrate into their teaching. The authors discuss five teaching strategies that are traditionally hard to implement due to time and effort constraints, but can be more manageable using AI tools.
ChatGPT and the Rise of AI Writers: How Should Higher Education Respond?
Gleason, Times Higher Education (Dec. 9, 2022)
This article makes the case that ChatGPT can generate high-quality structured text, from poems and essays to marketing materials and code. The author discusses the potential impact of AI on higher education, envisioning applications such as virtual tutors, research assistants, and improved collaboration among educators, students, and researchers. Importantly, she acknowledges the criticisms surrounding AI-generated text, including accuracy issues and biases.
AI-Resistant Assignments: 10 Strategies for K-12 Teachers
Campbell, Richard C. Campbell's Blog (2024).
This article outlines ten practical strategies for teachers to create assignments that encourage student engagement, creativity, and reflection, reducing the likelihood of students using AI tools inappropriately. Although this article was written for a K-12 audience, many of its suggestions are relevant to a college classroom as well.
How Do I (Re)design Assignments and Assessments in an AI-Impacted World?
Center for Teaching & Learning, University of Massachusetts (2024)
The UMass CTL created this useful page on designing assignments in an AI world. It provides a comprehensive overview of strategies for creating AI-resistant assignments that promote critical thinking, personal reflection, and unique insights that AI cannot replicate, and it emphasizes the importance of assignment transparency, the integration of real-world applications, and the promotion of learning as a social experience.
ChatGPT Cheating: What to Do When It Happens
Klein, Education Week. (Feb. 21, 2023)
This short article provides a concise, actionable framework for minimizing the potential for AI-enabled cheating in your courses and for responding when you suspect cheating has occurred. Of particular note: AI-writing detectors are not reliably accurate, often flag false positives, and cannot be used as sole evidence in an academic dishonesty report.
References
Perkins, M., Furze, L., Roe, J., & MacVaugh, J. (2023). Navigating the Generative AI Era: Introducing the AI Assessment Scale for Ethical GenAI Assessment. arXiv:2312.07086v1 [cs.AI]. https://doi.org/10.48550/arXiv.2312.07086.
Click Next to go to our final activity: Developing an Assignment-level AI Policy