Analyzing the implications of AI for your course
Here we will guide you through analysis and self-reflection about how AI can affect your own courses and teaching practice. We have organized our guidance into three broad topic areas: academic integrity, student success, and workload balance.
Exploring the pedagogical uses of AI chatbots
- AI chatbots might be used in a variety of ways for teaching, such as providing feedback, tutoring, coaching, supporting teamwork, running simulations, and more.
- Strategies for prompting chatbots include using structured prompts for outcome-oriented tasks, conversing naturally for open-ended tasks, providing context details, and so on.
- Practice using a chatbot to better understand its capabilities and limitations.
- Potential risks of chatbot use include concerns about truthfulness, privacy, bias and stereotypes, and equity and access.
Outcomes for this module
In this module, you will analyze how AI chatbots fit into your own course relative to the broader campus context around AI and technology. The nuances of these issues will vary depending on the unique characteristics of your discipline area and course. We encourage you to think carefully about your specific situation when going through this module.
After completing this module, you should be able to:
- Describe campus policy guidance from the Office of Community Standards regarding AI use.
- Analyze how AI might impact your specific course through the lenses of academic integrity, student success, and workload balance.
Thinking about your own course
As you move through this module, we ask you to think about the characteristics of the course or courses that you teach. We hope that by focusing on your specific course you can better apply the ideas and insights you gain through these modules to develop something actionable and meaningful to you. Consider the prompt "What do you want students to learn in your course?" and respond to the poll below.
Three areas of focus
The potential impacts of AI are wide-ranging, still emerging, and rapidly evolving. We cannot address them all here; instead, we will focus on three areas of particular concern for instructors: academic integrity, student success, and workload balance.
Academic integrity
Many instructors express concern that students will use AI to shortcut the learning process and present AI-generated text as their own work. When thinking about AI use in your Stanford courses, you and your students should refer to campus policies, including the Honor Code and the Student Judicial Charter, and understand what constitutes plagiarism. We ask you to consider what you can do to promote trust, integrity, and honorable behavior.
Student success
We all have a responsibility to support students in achieving success, which can have many dimensions. Supporting students' success might include preparing them for the future, helping them meet their own goals, and supporting their well-being. Consider these factors when determining whether AI tools align with your course's learning outcomes, how you will support students in using AI tools, and how you and your students understand the risks and benefits of AI. Learning about new AI tools, adapting your course design, and employing evidence-based teaching techniques are also ways to support your own and your students' success. Consider how your efforts can benefit all students across diverse backgrounds, identities, and experiences.
Workload balance
Supporting students sustainably and meaningfully requires attending to your own mental health and well-being. Adopting a new tool or growing your teaching practice can require significant work, and we recognize that you often have competing priorities and limited time. When considering the impact of AI tools, weigh the time needed to learn them and to make changes to your course or teaching practice against the benefits of adopting such tools and their alignment with broader organizational goals.
Campus policy guidance on AI use
The Office of Community Standards (OCS) has stated the following:
"Absent a clear statement from a course instructor, use of or consultation with generative AI shall be treated analogously to assistance from another person. In particular, using generative AI tools to substantially complete an assignment or exam (e.g. by entering exam or assignment questions) is not permitted. Students should acknowledge the use of generative AI (other than incidental use) and default to disclosing such assistance when in doubt. Individual course instructors are free to set their own policies regulating the use of generative AI tools in their courses, including allowing or disallowing some or all uses of such tools. Course instructors should set such policies in their course syllabi and clearly communicate such policies to students. Students who are unsure of policies regarding generative AI tools are encouraged to ask their instructors for clarification."
Given this guidance, it is important to decide on the AI policy that best fits your own unique course and context.
Self-evaluation of your course
Here is a method to help you analyze how AI chatbots might impact a specific course that you teach. We devised questions intended to spark your thinking as you analyze your own course and teaching practice. The material that follows is structured like a rubric, with criteria organized into the three focus areas of academic integrity, student success, and workload balance. Each area is further divided into sub-categories:
- Assessments: Measuring how well students learn what you intended and how you assign grades to students
- Student support: Providing students with what they need to succeed
- Learning activities: Activities that students do to reinforce learning
- Inclusion and belonging: Supporting a wide range of diverse students
- Discipline area: Unique characteristics of your discipline area
If you answer "Yes," "Very much," or "A lot" to many of the questions below, you might consider a more open policy and explore integrating AI tools into your course.
Academic integrity
| Sub-category | Criterion |
| --- | --- |
| Assessments | To what degree do your learning objectives align with higher-order thinking skills, such as creating original work, proposing solutions to complex problems, and internalizing values?* |
| | How effectively do your current assessments, rubrics, and so on measure your learning objectives?* |
| | How difficult would it be for an AI chatbot to successfully complete your current assessments? |
| | How clearly and consistently does your method help you in fairly grading student work? |
| | To what degree does your course provide multiple opportunities and forms of assessment and avoid single high-stakes assessments? |
| Student support | How well do you communicate to students what integrity means in your course? |
| | How clearly do you communicate to students any course or campus policies about academic integrity and AI chatbot use? |
| | How well might your students know how to use AI chatbots in responsible and honorable ways? |
| Learning activities | To what degree do ungraded (and therefore not subject to OCS policy) learning activities factor into your course? |
| | To what degree have students already mastered foundational skills that AI chatbots might augment? |
| Inclusion and belonging | To what degree do you model integrity and the responsible use of AI tools? |
| | To what degree might your students react positively to allowing AI chatbot use in your course? (Consider how much students might feel pressured vs. protected with a stricter policy, and how much they might feel tempted vs. trusted with a looser policy.) |
| | To what degree does your course foster belonging, psychological safety, integrity, and intrinsic motivation to succeed?* |
| Discipline area | How important are the ethical issues concerning AI use in your field? |
*See the "Learn more" section below for links to resources on these topics.
Student success
| Sub-category | Criterion |
| --- | --- |
| Assessments | To what degree could integrating AI chatbots make your assessments more compelling or effective? |
| | How well do current assessments align with students' goals and needs? |
| Student support | How well can or do you provide support for students on how to use AI chatbots effectively? |
| | To what degree are your students independent, experienced, and skilled in self-directed learning with technology tools? |
| | To what degree does your course promote, and do your students use, relevant support services such as academic coaches, writing tutors, and language partners? |
| | To what degree do you and your students understand and consent to the inherent privacy and data security risks that come with using AI tools? |
| Learning activities | To what degree could AI chatbots make learning activities more compelling or effective? |
| | To what degree do you value students in your course gaining experience with AI chatbots? |
| Inclusion and belonging | To what degree do you understand the different issues, challenges, and preferences of students typically enrolled in your course? |
| | To what degree would using AI chatbots benefit students, particularly first-generation or low-income students, students from under-represented minority groups, or students with less academic preparation? |
| | How flexible do you consider yourself and your course in adapting to the needs of diverse students? |
| | To what degree can you support equal access for students to AI tools in terms of affordability and accessibility? |
| | To what degree can you give students and your teaching team informed choices and alternatives in how or if they use AI tools? |
| Discipline area | How important is it for students in your discipline area to have experience with AI tools or understand AI-related issues? |
Workload balance
| Sub-category | Criterion |
| --- | --- |
| N/A | How easy would it be to adapt different aspects of your course to integrate AI chatbot use? |
| | How positive and motivated do you feel about integrating AI into your course or teaching? |
| | To what degree have you identified possible enhancements to your course that support the use of AI chatbots? |
| | To what degree do you have the time and resources to make changes to your course, or gain the skills needed to do so, while maintaining your own well-being? |
| | How many resources, collaborators, colleagues, and communities do you have to support you in this work? |
| | To what degree does integrating AI tools align with departmental or unit strategic goals? |
Assess and reinforce your learning
We offer this activity for you to self-assess and reflect on what you learned in this module.
Stanford affiliates
- Go to the Stanford-only version of this activity.
- Use your Stanford-provided Google account to respond.
- You have the option of receiving an email summary of your responses.
- After submitting your responses, you will have the option to view the anonymized responses of other Stanford community members by clicking Show previous responses.
Non-Stanford users
- Complete the activity embedded below.
- You have the option of receiving an email summary of your responses.
- Your responses will only be seen by the creators of these modules.
Learn more
- Creating Learning Outcomes, Stanford Teaching Commons.
- Ten Strategies to Promote Student Flourishing, Stanford Teaching Commons.
- Teaching and Learning with Generative AI, Stanford Center for Teaching and Learning.
- CRAFT AI Literacy Resources, Stanford Graduate School of Education.
- How ChatGPT Could Help or Hurt Students With Disabilities, The Chronicle of Higher Education.
- We tested a new ChatGPT-detector for teachers. It flagged an innocent student., Washington Post.
Works cited
Fowler, G. A. (2023, April 14). We tested a new ChatGPT-detector for teachers. It flagged an innocent student. Washington Post. https://www.washingtonpost.com/technology/2023/04/01/chatgpt-cheating-detection-turnitin/
McMurtrie, B. (2023, May 26). How ChatGPT Could Help or Hurt Students With Disabilities. The Chronicle of Higher Education. https://www.chronicle.com/article/how-chatgpt-could-help-or-hurt-students-with-disabilities
Office of Community Standards. (n.d.). Generative AI policy guidance. Retrieved August 28, 2023, from https://communitystandards.stanford.edu/generative-ai-policy-guidance
Stanford CRAFT. (n.d.). Retrieved July 28, 2023, from https://craft.stanford.edu/
Creating your course policy on AI
Example syllabus statements, suggestions, and sample sentences for creating your own AI course policy.
Go to the next module
Learning together with others can deepen the learning experience. We encourage you to organize your colleagues to complete these modules together. Consider how you might adapt, remix, or enhance these modules for your own needs. If you have any questions, contact us at TeachingCommons@stanford.edu. This guide was created by Stanford Teaching Commons and is licensed under Creative Commons BY-NC-SA 4.0 (attribution, non-commercial, share-alike).