The continuing development of generative artificial intelligence (AI) tools, capable of producing natural language, computer code, images, or other media in response to users’ queries, has the potential to impact teaching and learning in many ways. This article offers some guidance as we begin to more deeply understand and engage with AI tools in our teaching and learning practices.
Adapting teaching practices in response to AI
Natural language AI chatbots such as ChatGPT use vast amounts of text data to predict words or phrases in a given context. They can mimic natural human language and aid a wide variety of language-related tasks in far more sophisticated ways than previously possible. Other emerging AI tools, also trained on large datasets, produce computer code and images in response to user prompts. At the Center for Teaching and Learning (CTL), we have heard many questions, concerns, and expressions of interest from the campus community.
Many wonder how such tools might aid students in completing assignments or necessitate changes in how we assess student learning. Some are concerned about the impact these tools may have on academic integrity and issues of originality in writing. Others are interested in how such tools can aid instructors in teaching writing skills. And many are simply curious about AI tools, how they work, and what they can do.
While powerful new technologies can disrupt established practices, they also hold the promise of improvement and innovation. Educational technology is continuously changing, and the emergence of AI tools is another step in this evolution. As educators, we again have the opportunity to come together to adapt to the diverse and ever-changing needs of learners, learning contexts, and new tools and resources. We encourage you to thoughtfully engage with these technologies and consider how they may inform your teaching practice.
Below, we suggest several key teaching practices, some already familiar, to help faculty, lecturers, academic teaching staff, and teaching assistants navigate these emerging technologies and remain on the leading edge of pedagogic strategies, course designs, and curricula that integrate these tools and reflect current understanding in the science of learning.
Design assignments for an AI world
In many ways, teaching with AI tools in mind relies on a familiar concept: assignments that support students in linking thinking with writing (or other generative skills) are more effective. This includes the process of discussing, drafting, and revising ideas in relation to sources and evidence.
The use of AI tools may be less relevant when students experience a structured set of steps on the way to a final product, particularly when those steps include learning activities that elicit students’ own thinking, include formative feedback from instructors and peers, and build in drafts and revisions. For example:
- Assignments from Program in Writing and Rhetoric (PWR) courses often provide “scaffolding”—structure building up to complex tasks.
- Formative assessment and feedback strategies can help students understand their own progress and adopt a growth mindset toward their learning.
- Inclusive discussions in class support students in exploring ideas during the process of writing or generating work for an assignment.
Some educators are also exploring ways of intentionally incorporating various forms of structured interaction with AI tools into their assignments and learning activities. Such exploration may help instructors and students develop a shared understanding of the functions and limitations of AI tools, appropriate ways of using and citing the use of such tools, and the ethics and learning implications surrounding these emerging technologies.
Engage with students about AI tools
Discussing academic integrity with students in the context of your course and assignments remains an important aspect of helping students navigate your specific expectations and focus on important learning outcomes (which may or may not be supported by the use of AI tools).
Transparency in learning and teaching can promote students' awareness and understanding of how they learn in light of new AI capabilities. Remember that students are newly exploring emerging AI tools and may not have fully thought through their affordances and limitations. You can help students better understand how to approach the work, and whether AI tools are appropriate, by explaining why your assignments are structured as they are and by linking what you intend for students to learn at each stage to the way the assignment is sequenced.
As we learn more about how students use these tools, our guidance may become better informed. Conversations with students about AI in education can also help them to understand how developing their own thinking is critical for their growth. Sometimes the messy and imperfect steps on the way toward formulating their original arguments, code, solutions, analyses, and creative work are critical in the learning process.
Update your course guidance or policies
If your course offers guidance or has policies addressing permitted aid and resources for assignments and other coursework, this may be a good time to update and clarify which tools are and are not allowed, and why.
The Board on Judicial Affairs (BJA) has been asked to address the Honor Code implications of generative AI tools such as ChatGPT, Bard, DALL-E, and Stable Diffusion. To give sufficient space for instructors to explore uses of generative AI tools in their courses, and to set clear guidelines to students about what uses are and are not consistent with the Stanford Honor Code, the BJA has set forth policy guidance regarding generative AI in the context of coursework:
Absent a clear statement from a course instructor, use of or consultation with generative AI shall be treated analogously to assistance from another person. In particular, using generative AI tools to substantially complete an assignment or exam (e.g. by entering exam or assignment questions) is not permitted. Students should acknowledge the use of generative AI (other than incidental use) and default to disclosing such assistance when in doubt.
Individual course instructors are free to set their own policies regulating the use of generative AI tools in their courses, including allowing or disallowing some or all uses of such tools. Course instructors should set such policies in their course syllabi and clearly communicate such policies to students. Students who are unsure of policies regarding generative AI tools are encouraged to ask their instructors for clarification.
(Guidance adopted Feb 16, 2023)
Join the conversation
We invite you to consider these options when beginning to engage with AI tools. Let’s come together to lead Stanford toward forward-looking, effective, and human-centered uses of educational technologies.
Schedule a consultation
Do you have questions, or are you looking to discuss this topic further? CTL staff are ready to consult and discuss your assignments and course policies in light of new developments in AI. As with all our teaching consultations, we'll start with your goals and priorities and collaborate with you to generate and explore pedagogical options. Request a consultation here.
Share your experience with us
If you have made adjustments to your courses or assignments, or engaged students in positive, ethical, and appropriate uses of generative AI tools, we'd like to hear about your experience, which in turn will help us support other Stanford instructors and plan future communications and discussions. Connect with us using our consultation form to describe your experience so that we can follow up with you.
More guidance on teaching with AI is coming
Many teaching and learning strategies relevant to both the challenges and the positive potential of generative AI tools can help strengthen assessment, academic integrity, critical thinking, writing skills, and productivity. CTL, along with other campus partners across schools and units, is currently developing resources to support faculty and instructors that will be available here on the Teaching Commons website.
Explore related resources
Stanford faculty and researchers are already actively involved in the broader dialogue on AI and its relationship to learning:
- Faculty from the Stanford Accelerator for Learning are investigating AI and its implications for learning.
- Colleagues at the Stanford Institute for Human-centered Artificial Intelligence are deeply engaged in advancing AI to improve the human condition.
- The Stanford Artificial Intelligence Lab is continuing to support research into AI-powered solutions.
- Stanford Graduate School of Education (GSE) Office of Innovation and Technology offers insightful perspectives on its blog.
In higher education
- ChatGPT: Implications for Teaching and Student Learning. Center for Research on Learning and Teaching (CRLT Blog), University of Michigan.
- ChatGPT and the rise of AI writers: how should higher education respond? Nancy Gleason, Times Higher Education.
- Eight ways to engage with AI writers in higher education. Lucinda McKnight, Times Higher Education.
- AI bot ChatGPT writes smart essays — should professors worry? Chris Stokel-Walker, Nature.
- Education in the World of ChatGPT. Josh Brake, The Absent-Minded Professor Blog.
- Three Things to Know about AI Tools and Teaching. Derek Bruff, Agile Learning Blog.
- How About We Put Learning at the Center? John Warner, Inside Higher Ed.