Generative AI and Writing-Intensive Courses

This page was guest-written by Phil Shaw, Senior Lecturer, University Writing Program, and CTL Faculty Fellow for Student Success in Gateway Courses.

Our understanding and use of Generative AI programs like ChatGPT, Microsoft Copilot, Midjourney, and an increasing number of free and subscription tools have expanded significantly since December 2022. At that time, faculty questions about this new, suddenly available technology came from a place of genuine surprise and apprehension. What is this? How will students use it? How will I recognize the difference between student work and what has been created using GenAI? Those questions are still relevant, especially as access to this technology continues to expand.

RIT has yet to establish university-wide academic integrity guidelines for the use of generative AI in coursework, leaving pedagogical and assessment questions up to departments and individual instructors. This article focuses on our RIT context: First-Year Writing (FYW) and Writing-Intensive (WI) instructors who are interested in exploring theoretical frameworks for the use of generative AI in their classrooms. Those of us in the Center for Teaching and Learning believe that the use of these tools in classrooms is not simply an inevitability to be addressed but rather an opportunity to explore the possibilities of the technology while being mindful of risks to academic integrity, intellectual property, and student learning outcomes.

Questions about the implementation of generative AI tools in specific courses can be directed to the Center for Teaching and Learning (CTL), and consultation requests can be made to meet with a CTL Teaching Consultant or Faculty Fellow.

The following three theoretical frameworks and considerations are curated to reflect some of the relevant contexts of writing-intensive courses at RIT. These frameworks were chosen because they can be incorporated into existing curricula but ought to be explicitly presented to students as they work with GenAI as a part of their writing process.

Framework #1: Kleiman’s SPACE

“The Embrace AI Tools approach recognizes the strengths and limitations of AI tools and prepares students to use them effectively. This approach may lead to less attention placed on students mastering basic skills like sentence structure, grammar, spelling, and punctuation, allowing students to rely on help from AI for those. It will focus more on students learning to express themselves through developing their own voices; becoming skilled at communicating with different audiences; deepening their appreciation of literature, poetry, non-fiction, and other writing genres; and using writing to further their learning and thinking.”

Kleiman suggests having students do the following steps during the writing process:

  • Set directions for the goals, content, and audience that can be communicated to the AI system. This may, for example, involve writing introductory materials for the overall text and for each section. It could also involve writing much of the text and leaving some sections for AI to complete.
  • Prompt the AI to produce the specific outputs needed. A prompt gives the AI its specific task, and often there will be separate prompts for each section of text. An AI tool can also be prompted to suggest sentences or paragraphs to be embedded in text that is mostly written by the human author.
  • Assess the AI output to validate the information for accuracy, completeness, bias, and writing quality. The results of assessing the generated text will often lead to revising the directions and prompts and having the AI tool generate alternative versions of the text to be used in the next step.
  • Curate the AI-generated text to select what to use and organize it coherently, often working from multiple alternative versions generated by AI along with human written materials.
  • Edit the combined human and AI contributions to the text to produce a well-written document.

Framework #2: Dobrin’s Four Considerations

Dobrin proposes four ways to think about writing instruction in an age of generative AI (GenAI):

1. GenAI for Invention
“Can we find ways to use GenAI to help students develop their own avenues into a conversation?” (Dobrin, 2023, p. 21). Early in the writing and research process, GenAI can be used to develop and refine research questions, insider search terminology, survey and interview questions for primary research, and background information on a topic area. Encouraging students to prompt AI during the initial stages of a long-term project can be an efficient and effective way for them to enter an academic conversation.

2. GenAI for Revision
In a relatively short period of time, GenAI can offer specific feedback to students. That level of specificity relies on the student’s ability to prompt the GenAI effectively, so this is an approach best introduced in a structured classroom setting. As part of a peer review process, it can also encourage collaborative discussion about the uses and limits of the GenAI feedback.

3. GenAI for Critical Thinking
“One of the most prevalent critiques of GenAI in education has been the claim that GenAI will generate a new degree of laziness among students; that if GenAI platforms can do the work for them, students won’t take the time to think about the materials and assignments they’ve been given” (Dobrin, 2023, p. 21). This critique tends to hold only in the absence of explicit discussion and use of GenAI in the classroom. If writing instructors use GenAI’s output as a starting point for discussion, critique, and revision, students’ critical thinking about their writing can be assisted rather than supplanted.

4. GenAI for Research
The accuracy of GenAI’s analyses of information is increasing, especially with access to web-enabled iterations like ChatGPT 4 and Copilot. Primary research uses of GenAI can include thematic transcript analysis and the development of interview or survey questions. For secondary research, GenAI can suggest search terminology and introduce the discourse of a subject area, tailored to each student’s specific area of inquiry.

Framework #3: Harvard’s High-Level Principles

Harvard's Derek Bok Center for Teaching and Learning provides the following strategies for encouraging students to do their work authentically, in line with your learning objectives:

1. "Talk directly and specifically with students about how your assignments are meant to work. Our students are not, by and large, looking for opportunities to cheat or take shortcuts. The vast majority, in fact, are just as concerned to determine the ethical and responsible use of AI as are their instructors. The primary challenge posed by generative AI is not that, in making cheating easy, it will therefore make it rampant, but rather that its utility will blur the lines for even our most scrupulous students between seeking help or brainstorming ideas, on the one hand, and soliciting an unacceptable degree of assistance, on the other.”

2. “Disaggregate process from product, and render it visible. Now more than ever, we would encourage instructors to ask students to share early stages of their research and writing, in the form of preliminary assignments like project proposals, lists of analytical questions, annotated bibliographies, brief source analysis exercises, draft introductions, etc. Asking students to share their work in progress makes it considerably harder, not to mention less appealing, for students to outsource their thinking and writing to a large language model, as it would require them to forge, convincingly, not one but multiple phases of thinking and drafting.”

3. “Create opportunities for students to reflect on/talk about their work. So long as students imagine that they are submitting their final written work to a single reader (i.e. the instructor), and that said reader will never ask them to elaborate on, defend, or recapitulate their ideas in further conversation, leaning on generative AI might seem like a relatively safe (even victimless) indiscretion. If, however, students realize that they may have many readers—and, moreover, that those readers will ask them many questions about their writing [...] the value proposition of outsourcing all of those decisions to a large language model that won’t be able to help them respond to their readers in the moment becomes much less appealing.”

Whether you encourage the use of AI in writing-intensive courses is a matter of pedagogical choice, and there are good reasons to be wary of the uncritical adoption of this new GenAI technology. The use of detection software to catch “cheating” is not a foolproof method of keeping AI out of the classroom, and banning it outright might, unfortunately, encourage students’ non-transparent use. Partnering with students who are interested in making GenAI part of their research and writing process fosters collaboration and critical thinking about how this new tool changes our values and expectations around writing. But don’t assume that students will know how to use the technology without support and guidance. “In so far as you want to allow, or even encourage, your students to make use of AI to enhance their written work, you’ll want to make sure that they all have equal access to the most useful platform(s), and a fair chance to develop the requisite amount of proficiency in what is now called prompt engineering” (Harvard University, 2023, p. 9).