Teaching in the AI Era

The recent growth in generative artificial intelligence (AI) tools has quickly changed the education landscape. Generative AI is a type of artificial intelligence technology that can generate new content (such as text, images, audio, and computer code) based on patterns and information gathered from sets of sample materials. 

Below are some suggestions from the TLHub to help you stay at the forefront of these tools and their impact on your course. Please note that these are emerging tools, and we will continue to provide new and updated resources as we develop our guidance on the intersection of teaching and AI.

Top Tips

  • Review the GSB’s policy on student AI use for coursework. As with internet use, instructors may not ban AI use for take-home coursework. See more in our FAQs on Setting Course Policies on the Use of Generative AI Tools below.
  • Adjust your assignments and other coursework if necessary, using your course goals as a guide. See our tips below in the FAQs on Designing Coursework in the AI Era.
  • Access the GSB’s chatbot tool. We now have a private, secure, Stanford-approved chatbot for GSB faculty and staff. This tool, called GSB-GPT, is powered by ChatGPT and is approved for use with low and moderate risk data. Learn more in our FAQs on Using GSB-GPT and Other AI Tools below.
  • Hear ideas from the GSB community. Watch the recording of Faculty Voices: Exploring AI in the GSB Classroom to learn how GSB faculty are exploring AI in course assignments and activities. Review the event highlights of Student Voices: Exploring AI in the GSB Classroom to hear GSB students’ experiences and perspectives on AI in the classroom.
  • Establish and communicate your course policy on AI (view our template statements). State your policy in your course syllabus and the Course Policies and Norms form, and then follow up in class with your students. 
  • Get to know generative AI tools and how they might emerge in your course context. Consider running a few of your course’s assignment(s) through GSB-GPT (recommended) or another AI tool (see note on information security below). Discuss your findings with your students and explain how the AI tool’s output compares to your expectations for their work. 

Not sure where to start? See our tips for Starting Small with AI in the Classroom.

A Note on Information Security

GSB-GPT is the preferred generative AI chatbot tool for GSB faculty and staff. This secure tool is approved for use with low and moderate risk data. Learn more about GSB-GPT and how it can be used in the classroom in the FAQs on Using GSB-GPT and Other AI Tools below.

When using third-party generative AI tools that Stanford does not support, follow UIT’s resource on Responsible AI at Stanford. Such tools should be used only for low risk data that can be made public. When using these tools, Stanford users should avoid inputting materials containing students’ personal information and proprietary or copyrighted materials (which may include case studies, course assignments, data sets, and more), or any information that may be classified as moderate or high risk according to Stanford UIT’s risk classifications. Information you input into these tools may be shared with third parties, and the tool may use your prompts or questions to inform content generated for other users.

GSB-GPT is not widely available to GSB students, and Stanford does not yet support other AI tools for student use. If you encourage students to use such tools for work in your course, you must offer alternatives for students who do not wish to use AI tools. Learn more in the FAQs on Incorporating AI Tools into Coursework below.

FAQs about Generative AI Tools in the Classroom

Setting Course Policies on the Use of Generative AI Tools

The GSB has shared the following policy about instructors’ approach to student use of AI for coursework:

The use of ChatGPT and other AI tools for any take-home assignment is similar to the use of the internet. Just as instructors cannot ban internet use for take-home exams or other assignments completed outside of the classroom due to “undue temptation,” the GSB uses the same policy for AI on take-home work. Instructors may only ban AI tools for in-class exams. 

Wondering how to plan your assignments accordingly? See the FAQs on Designing Coursework in the AI Era.

State your policy in your course syllabus and the Course Policies and Norms form, and then follow up in class with your students. 

For syllabus statement templates and tips on setting your course approach to AI use, see our resource on Course Policies on Generative AI Use.

Follow UIT’s guidelines on Responsible AI at Stanford if you (or your students) use third-party generative AI tools that Stanford does not support (such as ChatGPT, Microsoft Copilot, Google Gemini, or Claude). Such tools should be used only for low risk data that can be made public. For example, when using these tools, Stanford users should avoid inputting materials containing students’ personal information and proprietary or copyrighted materials (which may include case studies, course assignments, data sets, and more).

Faculty and staff may use GSB-GPT, the GSB’s secure generative AI chatbot tool approved for use with low and moderate risk data. Learn more about GSB-GPT and how it can be used in the classroom in the FAQs on Using GSB-GPT and Other AI Tools below.

Designing Coursework in the AI Era

Tip: Consider attending a Teaching and Learning Hub workshop for GSB instructors on Teaching with AI.

Consider your course goals to inform how to guide students on activities, assignments, and exams. Choose one goal and consider the following questions:

  • What do you want your students to be able to do or understand by the end of the course?
  • How will students demonstrate their skills or understanding to you?
  • Is it important that students are able to demonstrate those skills or knowledge without using AI tools (or the internet) in the moment? Why or why not?

If it is important that students demonstrate skills or knowledge without using AI tools or the internet, consider the following:

  • Conduct an in-class activity or assessment during which students do not use AI tools. This could include an exam, presentation, role-play, or discussion.
  • Provide students with clear instructions on how to complete a take-home assignment and explain why the method you recommend is valuable for their learning. This will not prohibit AI use, but it will guide students’ approach to learning a skill or information.
  • Connect take-home assignments (that may use AI tools) to in-class activities (without using AI tools). For example:
    • Pair projects with an in-class presentation or poster session in which students must discuss and defend their approaches. 
    • Before assigning an essay on a general topic for homework, have students work in small groups to paraphrase the main takeaways of a class session and develop predictions for how those ideas will apply to upcoming course material.
    • Have students practice sample problems in class before assigning a take-home problem set.

If students may use AI tools or the internet when demonstrating certain skills or knowledge, let students know. Make sure students know what your expectations and standards are for an assignment or exam. If you would like them to use AI tools (or other supports) in a particular way, provide students with clear instructions.

Explain your choices about AI use to students. Connect your choices or recommendations to the learning goals of the course, industry standards (e.g., publication standards), or common industry cases (e.g., needing to give thoughtful/persuasive/clear input on the spot during board meetings).

As AI tool use becomes more prevalent, how can we help students avoid becoming over-reliant on AI tools? Here are five tips to promote students’ deep learning and thinking in the AI era:

Celebrate struggle. Genuine learning is strenuous rather than effortless. When asking students during an in-class activity or exam to struggle with a task or concept without the help of AI, explain why this struggle is productive and what real-world context this challenging work is preparing students for.

Engage authentically with AI. 

  • Lead a discussion about affordances and challenges of AI in your field.
  • Ask a guest (or students) to discuss their work experiences with AI or AI tools.
  • Challenge students to experiment with how they might use AI for different types of work.
  • Challenge students to reflect on situations in which they shouldn’t rely on AI. 

Slow down. Have students stop, slow down, and reflect during coursework. 

  • Break assignments into parts.
  • Ask students to complete AI-assisted work step-by-step.
  • Incorporate periodic reflection points into coursework, through writing or small group discussion.

Provide in-class demonstrations and practice. Students are more likely to execute skills or process content in a specific way (whether with or without AI tools) if the instructor has demonstrated this in class. If this is important in your course, demonstrate the skill or process in class at least once, then give students a chance to practice and ask questions.

Start with human ideas. Provide opportunities for students to think deeply about a problem, before leaning on AI and other supports as they complete coursework. 

  • Give students 5 to 10 minutes in class to brainstorm approaches or ideas before starting work on an essay, project, or problem. 
  • Arrange peer review feedback sessions on drafts of written work or presentations.
  • Ask students to reflect on connections between theory or frameworks and in-class experiential learning tasks or discussions.
  • Ask students to apply course concepts to an issue of their choice that they are passionate about or that is related to their work experiences.

Looking for more? GSB students provided their thoughts on this question during a student panel event on AI in the classroom. Read the highlights or watch the recording.

Incorporating AI Tools into Coursework

The GSB has provided the following guidelines for incorporating student use of AI for coursework:

When incorporating AI tools into your coursework, please consider student privacy, Stanford data policies, and equitable access. 

See additional FAQs in this section for tips on these considerations. 

See also our resource on Course Policies on Generative AI Use for sample statements sharing relevant Stanford data policies with students.

Stanford does not yet support generative AI tools for student use, and GSB-GPT is not widely available to GSB students. 

Students may use commercially available AI tools for their coursework, as long as they follow the guidelines provided in Stanford IT’s resource on Responsible AI at Stanford. For sample syllabus statements to share relevant Stanford data policies with students, see our resource on Course Policies on Generative AI Use.

GSB Digital Solutions is currently piloting custom course bots designed for student use in classes. If you are interested in learning more, please reach out to Digital Solutions at gsbdigitalsolutions@stanford.edu.

If you would like to use a tool in your course that is not currently approved, see the adoption process for new technologies for GSB courses.

If you encourage students to use AI tools (that are not Stanford-supported) for work in your course, please consider the following guidelines.

Instructors should:

  • Recommend AI tools that do not require students to register or sign up for an account, if at all possible.
  • Offer assignment alternatives for students who do not wish to use AI tools. This especially applies when asking students to use paid tools, to use tools that require personal information to create an account, or to complete assignments discussing personal information. 
  • Clearly state in the syllabus and course description, and reiterate on the first day of class, whether you will ask students to use a certain AI tool and that students may complete alternatives without penalty (i.e., if they do not wish to make an account or do not feel comfortable with the tool’s terms of use).

You may also wish to consider the following suggestions:

  • Provide specific instructions for students when asking them to use AI in assignments and/or demonstrate AI tool use for students. The more students are guided in their use of AI, the more likely they are to use the tools to support rather than shortcut their learning.
  • Students should learn about the limitations of generative AI tools. For example, AI tools sometimes generate incorrect information (often called ‘hallucinations’), and the tools’ outputs can be biased (e.g., more often representing a doctor as male and a nurse as female). For a sample syllabus statement on the need to critically evaluate AI-generated information, see our resource on Course Policies on Generative AI Use.

Here are some ways you might support students with different levels of AI tool experience:

  • Demonstrate in class how you might approach using AI tools for an assignment or task, or record a video demonstration to share with students.
  • Lead an open discussion or brainstorm with students on how they could imagine using AI tools for a particular task.
  • Ask students to use AI tools during an early group assignment so they can learn from one another.
  • Share resources with tips for using chatbots. Here are some quick tips for chatbot prompting you can adapt and share with students.

In a student panel discussion on exploring AI in the classroom, GSB students reported exploring or engaging with AI tools for tasks like: 

  • Summarizing material (developing key takeaways, asking questions about a reading, etc.)
  • Performing creative tasks (generating image, video, or data visualizations; writing a children’s story, etc.)
  • Getting writing support 
    • Generating or brainstorming ideas ahead of writing (e.g., using AI to overcome writer’s block or as a thought partner)
    • Using AI to refine the writing after first developing their own outlines, reflections, or ideas (e.g., as a writing mechanics assistant)
  • Drafting potential talking points for a discussion
  • Gathering on-demand feedback (e.g., to check whether an answer they’ve developed is correct or needs improvement)
  • Performing information searches (e.g., Perplexity AI)
  • Drafting and troubleshooting code (e.g., GitHub Copilot)
  • Compiling meeting notes (e.g., Otter.ai)

Encourage students to critically evaluate all AI-generated content, no matter how they use the tools, and refer students to your AI use policy to let them know how and/or when to report their AI use.

For more, review the event highlights or watch the recording of Student Voices: Exploring AI in the GSB Classroom.

Watch the recording of Faculty Voices: Exploring AI in the GSB Classroom to learn how GSB faculty are exploring AI in course assignments and activities (Note: GSB faculty panel begins at 30:26). The panel features the following faculty:

  • Justin Berg, Former Assistant Professor of Organizational Behavior
  • Jennifer Aaker, The General Atlantic Professor in Marketing
  • Glenn Kramon, Lecturer in Management
  • Dan Iancu, Associate Professor of Operations, Information & Technology
  • Julien Clement, Assistant Professor of Organizational Behavior

View our resource on Starting Small with AI in the Classroom, in which we offer tips on:

  • How to test your course assignments with AI tools
  • Reframing existing coursework to incorporate AI use thoughtfully
  • Short in-class activities to bring AI into the classroom in 10 minutes or less
  • And more!

Using GSB-GPT and Other AI Tools

The GSB now has a private, secure, Stanford-approved instance of ChatGPT for GSB faculty and staff use (courtesy of Digital Solutions). This secure, Stanford-approved tool, called GSB-GPT, can be used with low and moderate risk content to support both your teaching and research. Please read these guidelines for additional information on privacy and policy considerations.

GSB-GPT is…

  • Powered by GPT-4 Turbo.
  • Hosted securely to meet GSB data privacy and protection requirements. Data that is input into GSB-GPT is not used by the large language model (LLM) provider for training purposes and cannot be accessed by other users.
  • Approved for use with low and moderate risk data according to Stanford UIT’s risk classifications.
  • Provided at no cost to faculty and staff.
  • Accessible using your SUNet credentials.

GSB-GPT is safe to use with course and student materials containing low and moderate risk data.

Instructor-generated materials. GSB-GPT may be used with materials such as:

  • Exams 
  • Assignments
  • Lecture notes
  • Slides

⚠️ Student-generated materials, including completed assignments, other coursework, and student survey or evaluation responses. These materials have restrictions on how they may be used with GSB-GPT:

  • Instructors may input student content for uses such as generating takeaways about student learning or highlighting gaps in understanding, but Stanford discourages using AI for grading at this time.

⚠️ Published and proprietary materials, including case studies and data sets. These materials have restrictions on how they may be used with GSB-GPT:

  • Some published and proprietary materials, including some library materials or research data, may not be used with AI tools. Please consult with the Business Library for specific usage guidelines.
  • Any use of published or proprietary materials in the classroom, including when using AI tools, should align with fair use principles. Please consult with Stanford’s Copyright and Fair Use Center for further information on specific classroom use cases.

High-risk data. GSB-GPT is not safe for any data classified as high risk according to Stanford UIT’s risk classifications.

GSB faculty and staff may use third-party AI tools that are not Stanford supported according to UIT’s guidelines on Responsible AI at Stanford, but only for low risk data that can be made public. When using these tools, faculty and staff should avoid inputting materials containing students’ personal information and proprietary or copyrighted materials (which may include case studies, course assignments, data sets, and more), or any information that may be classified as moderate or high risk according to Stanford UIT’s risk classifications.

GSB-GPT is not widely available to GSB students and Stanford does not yet support generative AI tools for student use. Students are allowed to use AI tools for coursework, with limitations.

For more information, see the FAQ above, “What AI tools are students allowed to use?” in Incorporating AI Tools into Coursework.

AI Tools and Academic Honesty

Under the honor code, faculty may use AI detection software if they provide students with “clear, advance notice,” as described in the Office of Community Standards’ Tips for Faculty & TAs.

If you are considering using AI detection software, consider the following guidelines:

  • AI detection software cannot be used as a deterrent for student AI use in take-home coursework, since GSB instructors may not ban student AI use for take-home coursework.
  • Stanford does not have approved general licenses for plagiarism monitoring tools (a common example is Turnitin). If you choose to use AI detection software, do so with a high degree of caution: such tools vary widely in accuracy and can produce both false negatives and false positives. Users can also bypass detection by revising AI-generated text and by running their work through detector tools themselves to check the likelihood of detection.
  • Each AI detection tool differs in its approach to privacy and how inputted material may be shared with third parties. If you choose to use an AI detection tool, avoid inputting information that should not be made public, according to UIT’s guidelines for Responsible AI at Stanford. This includes personal, sensitive, confidential, or proprietary information that may be contained within students’ coursework, including student names.

You may ask your students to cite some or all of their use of AI-generated content. (See our resource on Course Policies on Generative AI Use for syllabus statement templates and tips on setting your course approach to AI use.)

If you are concerned about student use of AI-generated content that is not properly cited, consider the following:

  • Test your course assignments using one or several AI tools. This will provide the clearest sense of how an AI-generated output may compare to high-quality student work in your course. Note that such tools do not generally produce the same output each time a question is posed.
    • Note: When testing assignments with AI tools, we recommend using GSB-GPT. Be sure to choose assignment prompts and materials that do not include high risk data. When using a third-party tool that is not Stanford supported, be sure to choose assignments that contain only low risk data that may be made public.
  • Some common strategies for detecting plagiarism may be useful for identifying improperly cited use of AI tools. These include looking for generic or repetitive language, large shifts in writing voice or style, improper and missing citations or facts, or if a student can’t explain how they arrived at an answer or made choices in producing an assignment.
  • AI detection software is available, but has significant limitations. See above for more.

There are some methods you can use to discourage students from taking learning shortcuts when completing assignments. For design practices that help improve learning and encourage deep student engagement, see the FAQs on Designing Coursework in the AI Era.

Getting Started with AI

Generative AI is a type of artificial intelligence technology that can generate new content based on patterns and information gathered from sets of sample materials. Here’s what else to know about this technology:

  • The most well-known generative AI tools are ‘chatbots’ (such as GSB-GPT and commercial tools like ChatGPT, Microsoft Copilot, Google Gemini, or Claude). These are AI-based text generation tools that users interact with through a chat interface. Users type a question or prompt, and the chatbot responds with a cohesive and creative written answer. The chatbot can then refine or adjust its responses based on how the user continues to interact with the tool, such as by posing follow-up questions or asking for changes to the chatbot’s output.
  • Other generative AI tools produce different types of output, such as images (e.g., DALL-E or Midjourney), code (e.g., GitHub Copilot), data analysis or visualization (e.g., ChatGPT Code Interpreter), or internet search results (e.g., Perplexity).
  • AI tools use ‘training data’ (sample materials or data) to inform the content they generate. The quality of AI-generated content can vary widely, in part based on the quality or focus of the training data. For example, a tool trained on large amounts of internet content may be able to respond to a prompt in many different and creative ways, but the tool’s output may also reflect the varying accuracy or biases embedded within the sample internet content. In addition, AI tools sometimes prioritize completing or responding to a user’s prompt over accuracy of information, and can sometimes make up information entirely (known as ‘hallucinating’).
  • For definitions of common AI terms and additional explanations of how AI chatbots work, see Stanford Teaching Commons’ resource on defining AI and chatbots.
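For readers curious about the mechanics behind a chatbot’s ability to “refine or adjust its responses,” the key idea is a growing message history: each user prompt and each model reply is appended to a running list, and the whole list is sent to the model on every turn. The sketch below illustrates this in Python; note that `fake_model_reply` is a stand-in we invented for a real chatbot API call (an actual tool would require a provider’s client library and credentials), so only the conversation-history pattern itself is the point.

```python
# Sketch of how a chatbot conversation accumulates context across turns.
# `fake_model_reply` is a placeholder for a real LLM call; it just reports
# how much conversation history it received.

def fake_model_reply(messages):
    """Placeholder for a real model call: the reply depends on the
    full conversation history passed in."""
    user_turns = [m for m in messages if m["role"] == "user"]
    return f"Reply to turn {len(user_turns)} (context: {len(messages)} messages)"

def chat_turn(history, user_text):
    """Append the user's message, get a reply, and append that too,
    so later turns can build on the whole conversation."""
    history.append({"role": "user", "content": user_text})
    reply = fake_model_reply(history)
    history.append({"role": "assistant", "content": reply})
    return reply

# One system message sets the chatbot's role; each turn adds two messages.
history = [{"role": "system", "content": "You are a helpful teaching assistant."}]
chat_turn(history, "Summarize the main idea of today's reading.")
chat_turn(history, "Now shorten that summary to one sentence.")
# The second request only makes sense because the first exchange is
# still present in `history` when it is sent to the model.
```

This is why a follow-up like “make it shorter” works in a chatbot: the earlier exchange travels along with every new prompt, at the cost of the conversation growing longer with each turn.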

Test out your course’s assignment(s) using one or several AI tools. Discuss your findings with your students and explain how the AI tool’s output compares to your expectations for their work. See tips and suggestions for testing your course assignments with AI tools.

Note: When testing assignments with AI tools, we recommend using GSB-GPT. Be sure to choose assignment prompts and materials that do not include high risk data. When using a third-party tool that is not Stanford supported, be sure to choose assignments that contain only low risk data that may be made public.

Learn more about AI tools’ strengths and limitations with five fun exercises. We recommend choosing creative topics and/or topics that you are an expert in (whether that is a business, teaching, or other topic), for the best sense of the scope and limitations of the AI tool. Select your chatbot tool of choice (e.g., ChatGPT, Microsoft Copilot, Google Gemini, Claude) and try one or more of the following:

  • Stump the AI. Select a topic that you are an expert in and prompt the chatbot for increasingly specific or obscure information. 
  • Brainstorm creative writing. Ask the chatbot to help you develop a fictional short story, and explore different elements of the story through conversation with the chatbot.
  • Practice a foreign language. Prompt the chatbot to be your supportive language teacher in a foreign language of your choice and ask it to help you with vocabulary practice for travel in a relevant country. 
  • Develop a project plan. Ask the chatbot to develop a project plan for an upcoming project or an itinerary for a trip, and then ask for further details, reorganizations, or sample products.
  • Try something fun. Create a recipe for a new fusion dish. Plan a themed surprise party for a special guest of honor. Play a guessing game about animals, movie stars, or another topic. Provide song recommendations based on your creative descriptions. 

These and additional sample prompts are provided in Stanford Teaching Commons’ resource on exploring the pedagogical uses of AI chatbots.

For a step-by-step guide to generative AI tools in the higher education context, see Stanford Teaching Commons’ Artificial Intelligence Teaching Guide.

TLHub Support

We can offer one-on-one consultations, answers to questions via email, and even a short presentation with Q&A at your next faculty meeting. If you have questions about the impact of generative AI tools on your course and assignments, reach out to discuss your concerns or specific use case with us.

Additional Resources

Stanford Community Resources

Resources from Outside Institutions

Acknowledgements

This article draws from the Artificial Intelligence Teaching Guide, Stanford Teaching Commons.

Important Note:

Technology is changing at a rapid pace. While we make every attempt to ensure our content reflects changes to the interface and functionality, we can only guarantee the accuracy of the content on this resource page as of when it was written or recorded. Please check the software developer's website for the latest updates and release notes for the most up-to-date information. If you have questions or concerns, or need additional support, please contact us.