Teaching and Learning Resource Center

AI Teaching Strategies: Having Conversations with Students


Generative AI is an increasingly important topic to discuss in learning environments. The still emerging technology is likely to have transformative effects on higher education, students’ future careers, and society as a whole. As educators, we have a responsibility to approach conversations about generative AI with care and intentionality. 

That said, many of us are still learning about AI and its implications for teaching and learning, and there has been much debate about its potential benefits and pitfalls. Some concerns raised by educators center around AI and academic integrity, but there are unknowns related to privacy, security, and copyright as well. Before we can have meaningful discussions about AI with students—and before we employ AI tools in our teaching—we should familiarize ourselves with these issues. 

This guide will acquaint you with generative AI, address some of the concerns around its educational uses, and provide concrete suggestions for how to openly and effectively discuss AI with your students.

Generative AI and Its Uses

What is generative AI? When is it useful and when can it be problematic? How can we apply AI effectively for learning and for our course-specific class activities?

Generative AI "creates content—including text, images, video and computer code—by identifying patterns in large quantities of training data, and then creating original material that has similar characteristics" (Pasick, 2023). This is accomplished through artificial neural networks, a type of machine learning system loosely inspired by the human brain. Many educators and students are recognizing the potential uses of AI in the classroom. Possibilities range from brainstorming and creating outlines or drafts, to summarizing complicated texts, to aiding students with grammar and language practice. 

Learn more about generative AI and its applications (and limitations) for teaching and learning in AI Considerations for Teaching and Learning, or by exploring the additional resources below. 

Some Concerns about AI

Though we can envision many educational uses for AI, the most prevalent generative AI platforms have raised red flags around privacy, security, and copyright. Models like GPT-3 and GPT-4 are trained on massive amounts of data scraped primarily from the internet. Even though this material is collected in aggregate, AI tools often—given their wide reach—obtain all kinds of personal data and copyrighted material. How they collect and make use of that data, as well as the information users input into their platforms, is frequently unclear. It’s important to engage our students critically with these issues and take them into account when planning course activities.

Privacy and Security

ChatGPT and other generative AI tools are third-party software systems that can track, collect, and share data from their registered users. Their terms of service are typically vague regarding how user data is applied to continue training their platforms, as material generated or uploaded by users ends up becoming part of the LLMs. For example, OpenAI’s privacy policy states that it reserves the right to collect and share user data, but it does not clarify with whom or for what purpose. This lack of transparency brings up potential concerns for users' privacy and security (Caines, 2023).

It’s important to note that most AI platforms have not been evaluated or approved for use in the classroom by Ohio State. Consider this article from the Office of Technology and Digital Innovation (OTDI) about the dangers of using unsupported technologies: Don’t Fall Victim to Clickthrough Agreements. In short, when using many unapproved platforms, including generative AI tools, users may explicitly or implicitly agree to terms set by the companies that allow them to handle users’ data privacy and security in ways to which Ohio State cannot assent on behalf of students and faculty.

Microsoft Copilot and Adobe Firefly are currently the only generative AI tools that have been vetted and approved for use at Ohio State. Copilot is an AI-powered chatbot that draws from public online data, giving you access to better answers and greater efficiency, but with additional security measures in place. Adobe Firefly is a generative AI engine that aims to support and augment your creative work. You can use Firefly to generate and enhance images, edit objects, and more. 

Learn more about approved AI tools at Ohio State. 

UX Tip

Provide AI alternatives

Some students may be uncomfortable creating accounts in and using AI applications, given the privacy and security concerns. When integrating AI into assignments and other learning activities, avoid making AI use a required task. Instead, provide alternative tools or approaches that support students in achieving the same learning outcomes.

Learn more in AI Teaching Strategies: Transparent Assignment Design. 

Copyright

There is likewise a lack of clarity and consistency around how generative AI platforms use creative material they collect from the internet and from users. Texts generated by language models, for instance, have repeatedly been shown to plagiarize, from direct, word-for-word copying to misrepresenting others’ ideas as their own (Tutella, 2023). Questions regarding the intellectual ownership of AI-generated texts remain contentious, with no resolution in sight. 

Many writers and artists are pushing AI platforms to be more transparent about the kinds of data their LLMs draw on, and to offer compensation to artists and writers when they pull from copyrighted material. Some creators don’t want their work being used to train AI language models. In fact, there is concern that the models could then generate convincing output in the particular style of a writer or artist, as in this case of "fake" books attributed to an established author.

Beyond moral conundrums around intellectual property, these murky areas have practical impacts for the works students author in your course. Encourage them to err on the side of caution when it comes to claiming outright authorship of anything created by ChatGPT or other AI language systems (McKendrick, 2022). As the major citation styles (including MLA and APA) also lack clear guidance on how to cite AI-generated material, you will need to offer specific expectations of your own. 

Strategies for Discussing AI


Keep in mind foundational student-centered methods for facilitating discussions when planning conversations about AI. Be transparent about your own knowledge and expectations, model and support critical thinking and reflection, and advocate for rigor in student thinking.

  • Be transparent. As instructors, we cannot be expected to be experts in generative AI technologies. But it’s important to be open with students about our knowledge (even if it’s limited) and about our expectations for AI use (or non-use) in our classrooms. 
  • Grow your own knowledge of AI technologies. Dedicate some time to play with ChatGPT, Dall-E, or other generative AI systems on your own, in addition to encouraging students to explore these tools. Use what you learn to inform class activities.
  • Facilitate authentic dialogue about the benefits and limitations of AI. Encourage students to grapple with the advantages, costs, and open ethical questions about AI technology during class discussions, before assignments and activities that use AI, and in post-reflection activities. 
  • Collaborate with students to develop class expectations. As you discuss the benefits and limitations of AI, you might gather students' input on class norms or guidelines for AI use. Providing students some agency and ownership in course policies may help them feel more invested in meeting your expectations for their coursework.
  • Model critical thinking, curiosity, and reflection. What an instructor does may be more important, at times, than what they say. If students see that you are being inquisitive, careful, and ethical in your approach to AI use, they are more likely to engage in a similar manner. You can even “think aloud” as you demonstrate AI-related skills, such as crafting a prompt to input into an AI tool, and pondering the quality of output it generates.

Conversation Starters and Class Activities

Explore the suggestions below for individual, small-group, and whole-group activities you can incorporate into your course. 

Reflect on prior experiences with AI.

Share your own experiences using AI with students. What did you learn? What frustrated you? Model how important reflection and metacognition are to the learning process by taking this as seriously as you would expect your students to take a reflection assignment. Then ask students to reflect on their prior use (or understanding) of the technology. They have likely already explored AI, or they are at least curious about it. 

You could structure this activity using small-group discussion, written reflections, online discussion boards, or another approach—the goal is to model your process for engaging with and reflecting upon the technology and encourage students to do the same.

Discuss shared goals for AI knowledge and use.

Consider your goals around AI knowledge and use for the semester. What would you like students to know, do, or value about AI tools? Which aspects of AI (if any) are integral to your specific course or subject area? List out your goals on one side of a two-column chart. 

Then have students share their goals and preferences for AI use in your course—write them in the other column. Compare and contrast the ideas you’ve generated, highlighting common goals and discussing reasons for divergence. To spur students to share original ideas, keep your column covered while you list theirs, then reveal your list before discussion. 

A Venn diagram would also be a useful tool for facilitating this discussion, if you want to add your ideas and students' ideas simultaneously, comparing and contrasting as you go.

Co-create class expectations for AI and academic integrity. 

Offering students a chance to weigh in on your course policies for AI use can motivate them to respect and abide by them. Moreover, facilitating dialogue about specific AI uses in your course may help students develop a more sophisticated understanding of ethical and responsible AI use, in general.

You might open the activity by discussing the concept of academic integrity. What does it mean? How does it apply to your specific course activities? Why does it matter? You could display a syllabus statement you have included around integrity, or a university definition or policy. Then have students share their interpretations and practical examples of academic integrity (and misconduct) in action.

Next, put students into small groups to discuss what they believe IS and IS NOT acceptable use of AI for your course (for studying, brainstorming, assignments, and so on). Each group could list out their ideas in a shared note in OneNote or document in OneDrive using Microsoft 365. Have them fill out a three-column chart labeled Acceptable, Not Acceptable, and Unsure.

Then, reconvene as a whole class. As each group presents their suggestions, add them to a shared three-column chart. Gather input from other students and weigh in with your own thoughts as you list ideas. You might need to ask groups to clarify or defend the rationale for some of their contributions and spend extra time to come to consensus on any items in the Unsure column.

Finally, try to hone students' collective ideas into clearly stated expectations for AI use in your course. You might work on that together during class, or use their ideas to inform the wording of your course policy later. If the latter, assure students that you will seriously consider their input; be ready to explain your reasoning if you don't use some of their suggestions. 

Learn more about how to Support Academic Integrity when it comes to generative AI.

Share examples of AI use in your field.

Search online for examples of how instructors in your field are using AI to teach. What do you find beneficial about these examples? What do you find problematic? Discuss the examples with your students. If they understand your position and rationale about AI in relation to your content area, they can develop a broader understanding of its acceptable use, beyond thoughtlessly following the policy in your syllabus.

Alternatively, apply this activity to examples of how professionals or scholars in your field are using AI to work and research, with the goal of helping students develop a framework for thinking about the potential productive and ethical uses of AI in their future careers.

You can also turn this activity into an independent or group assignment in which students are tasked to find examples of AI use in your discipline (online or in newspapers, journals, or periodicals, for example). They can then report back to the class on what they discover.

Experiment with AI uses for specific coursework.

First, take some time to input exam questions, essay prompts, or learning activities into a generative AI system like Bard, ChatGPT, or Claude. Also experiment with AI systems that are not text-based, like DALL·E 2 (which creates images) or sheeter.ai (which creates Microsoft Excel formulas). As you investigate these tools, keep in mind ways your students might employ them to plan, develop, or complete work for your course.

Next, task students with investigating different AI systems they could use to complete proposed course assignments in the syllabus. You might have them list the tools, their uses for specific assignments, and the pros and cons of using them.

Follow these investigations with a class discussion where you can weigh in on what AI activities you would allow or disallow. What approaches to AI use might you encourage? For example, perhaps you are less interested in students creating AI-generated data visualizations (which some systems can now do), than in students using the AI output to support an original argument. Perhaps you support students using AI to brainstorm or generate outlines for written assignments, but not to generate full drafts. Be prepared to explain your reasoning to students.

Brainstorm alternate assignment ideas. 

Consider the reality that many AI detection programs are unreliable. If you hope to promote integrity by limiting students’ use of AI for your coursework, you might engage them in developing alternatives to written assignments.

Ask pairs or small groups of students to brainstorm low- or no-stakes assignments, creative assessments, and other activities that would allow them to demonstrate their learning outside of a final written product. Prompt them to think of types of assignments, mediums, or real-life applications that might be harder to achieve using AI. How could they capture the “process” of creating their work, rather than just the final “product”?

Learn more about developing assignments that use (or limit) AI in AI Considerations for Teaching and Learning and AI Teaching Strategies: Transparent Assignment Design.

Explore concerns around generative AI.

Openly discuss concerns around generative AI to prepare students to effectively—and ethically—employ it in your course and beyond. 

Put students into small groups to examine various AI platforms’ terms of service and experiment with creating AI output. Provide discussion questions that guide them to consider how generative AI tools address a variety of concerns, such as how they do (or do not):

  • Replicate / address bias in the data sets they're trained on 
  • Account for accuracy in the output they generate 
  • Address the collection and use of copyrighted materials
  • Handle the privacy of users or identifiable information in their data sets 
  • Exploit human labor in coding or in the data that they collect 
  • Design their interfaces for accessibility and transparency

Then have students consider how users of generative AI might respond to or remediate these concerns, if possible. For example, if they use generative AI to brainstorm or develop their drafts for an assignment, how might they deal with the potential biases or copyright infringements in the AI output? Helping students unpack concerns around generative AI through hands-on practice will support them in developing a critical capacity for incorporating AI into their writing and other work.

Consider social ramifications of AI.

To further students’ engagement with critical issues surrounding AI, have them envision a future where it is the prominent driving force behind media and social meaning-making. Ask students to imagine what it would be like if most publicly available information found online were created by AI systems, systems that are accurate only about 80% of the time and often pronounce falsehoods with the utmost certainty. Is this a desirable world? Why or why not? How would they suggest we prevent such a future, if we should?

If this reality sounds outlandish to students, have them explore news articles, blogs, or other writing about the proliferation of AI, such as this recent Substack article by noted AI scholar Gary Marcus. You could also have them read a few articles first so they have fodder for their arguments, particularly if they do not have much prior knowledge of AI.

This activity could be approached as an individual written reflection, small- or whole-group discussion, or as a discussion board prompt.

Model generating—and evaluating—AI output.

Create something new with an AI image generator and display it for the class. Then discuss the potential copyright and intellectual property implications. Pose questions such as: Who owns this image? How was the image created? What source material led to its generation? Could you submit this work for an assignment? To a gallery? Why or why not?

This activity may also provide an interesting avenue to discussing related topics, such as AI and academic integrity.

Facilitate post-assignment reflections.

When designing assignments that use generative AI, it is good practice to include components that ask students to reflect on how they used it, why they used it, and how well it worked for their purposes. Having them reflect on process (in addition to submitting a final product) can give you valuable insight into their learning, while guiding them to become more intentional users of AI.

For example, if students are using AI to create output for a written assignment, have them consider reflection questions like the following:

  • What AI tool did you use? Why?
  • What prompt did you enter to generate your output? How well did that prompt work? Did you have to enter additional prompts? Why?
  • Were you satisfied with the output the AI tool generated? What revisions did you have to make before using the content for this assignment? How does your final product compare to what the AI tool initially generated?
  • Did your AI output include any problematic content, such as bias, misinformation, or plagiarized material? How did you respond to or remediate these problems?
  • Would you use this AI tool again for a similar task? Why or why not?

In addition to submitting their reflections with the assignment, you could have students discuss these questions with peers, whether in small groups or as a whole class.

Stay informed and update your students.

Subscribe to a newsletter about AI developments and keep up an ongoing conversation with students over the course of the term. Discuss any noteworthy news, what you are learning, and how these developments might impact your discipline, its research, your instruction, the workforce, and so on. Curiosity and passion are contagious—share your excitement about the improvements this new technology could bring as well as any honest reservations you may have. Invite students to share their excitement and concerns as well.

Summary

As generative AI proliferates, it's important to address its uses and limitations in your course. While you are likely not an AI expert, familiarizing yourself with its basic functionality, educational benefits, and potential problems can help you have more effective conversations about AI with students. 

Some important issues to consider include how AI platforms protect (or do not protect) the privacy and security of their users' data, as well as how they can infringe upon copyright or commit outright plagiarism. Because of the lack of clarity in many tools' terms of service and the unanswered questions about AI and intellectual property ownership, proceed with caution as you plan and facilitate AI-related activities.

Keep the following approaches in mind when engaging your students with generative AI.

  • Be transparent about your own knowledge of AI, as well as your expectations for AI use in your course. 
  • Grow your own knowledge of AI, and use what you learn to inform class activities. 
  • Facilitate authentic dialogue about AI's advantages, costs, and unanswered questions, whether for specific course activities, for your field, or for society more broadly.
  • Collaborate with students to develop expectations and policies for AI use in your course or for particular assignments.
  • Model critical thinking, curiosity, and reflection as you use AI tools, and guide students to do the same.

Try out one or more of the conversation starters and class activities above to start critically engaging your students with generative AI.

References