Office of the Provost–Division of Faculty Success
Trinity Hall 106, 800 Greek Row Dr., Box 19128
The University of Texas at Arlington
Arlington, TX 76019
Phone: 817-272-7464 | Email: CRTLE@uta.edu
How Faculty are Using AI in Teaching
Faculty who presented at the CRTLE AI Course Redesign Institutes (August 2025 and October 2025) shared concrete, classroom-tested ways they are using AI to support learning rather than replace it. Instructors described designing assignments where students use AI to generate study question banks from course content, training discipline-specific models on students' own work to support creative iteration in design studios, and using AI-supported feedback tools to help students reflect on performance and improve skills. Others highlighted using generative tools to refine assignment prompts, build rubrics, and scaffold complex projects while maintaining clear expectations for transparency and ethical use. Across disciplines, these faculty emphasized that AI works best when it is intentionally framed: as a collaborator that helps students practice analysis, reflection, and problem-solving, while keeping human judgment, disciplinary knowledge, and learning goals firmly at the center.
AI Teaching Quotes
In my courses on responsible AI and the future of work, I design assignments where students use AI tools transparently to support tasks like drafting, revising, or analyzing scenarios. But they also have to articulate their own ethical framework for how and why they used those tools. The emphasis is on critical reflection and accountability, so AI use strengthens students’ thinking rather than replacing it. I design my courses with the assumption that students will use AI, so instead of asking ‘Did they use it?’ I ask, ‘What kind of thinking does this assignment require?’ I use AI in teaching by building assignments that require personal judgment, ethical reasoning, and reflection—things AI can’t do on its own. Students might use AI to structure writing or check clarity, but the core of the work has to be human. I’ve shifted my feedback away from grammar and toward meaning, because AI can already handle surface-level corrections. In my teaching, AI becomes a collaborator that students must evaluate critically, not a shortcut that replaces learning.
When I introduce tools like Teachable Machine, the goal is not to turn students into programmers but to help them understand how machine learning actually works. Students train simple models themselves using images, sounds, or poses, and that hands-on process helps demystify AI. It becomes a way to talk about data quality, bias, and limitations while students are actively creating and testing models in class. In my teaching, I use AI to make machine learning concepts tangible for students who may not have a technical background. I rely on hands-on tools like Teachable Machine so students can actually train models themselves and see how data, bias, and accuracy work in practice. The goal isn't to turn students into AI specialists, but to help them understand how AI systems learn, where they fail, and why ethical judgment matters. I use AI in teaching as a way to demystify the technology and empower students across disciplines to engage with it thoughtfully and responsibly.
I love using AI for image and graphic creation in my courses, especially when I'm preparing materials quickly. Whenever I have AI generate a graphic, I also prompt it to create alt text and check color contrast, so accessibility is built in from the start rather than added later. That combination saves time while still keeping my materials aligned with inclusive teaching practices. I use generative AI tools like Copilot to refine case studies, assignment directions, and rubrics. I'll draft the assignment myself, then ask the AI what might be unclear to students or how the directions could be more concise. It helps me anticipate student questions and tighten my instructional design before the course even begins. One way I use AI directly with students is through simulated role-plays. I've had students work with a generative AI to simulate a mental health crisis text line, which allows them to practice communication and decision-making skills in a low-risk environment before working with real clients. This builds confidence and professional readiness.
I teach an intro-level programming course, and one thing I'm going to implement this coming semester is having students create a problem bank using AI. They already put homework questions into tools like ChatGPT, so instead of pretending that doesn't happen, I want to channel it. Students will use AI to help generate and organize questions they need to study for exams and homework, and then we'll build that question bank together as a class. That way, AI becomes part of how they learn to study and prepare, not just something hidden or misused.
I was honestly a reluctant AI adopter at first because I wanted to hear students’ ideas, not what AI was creating for them. What changed my perspective was realizing that students and staff were using AI to organize and clarify their thinking, not replace it. In my teaching and career preparation work, we now show students how AI can be used ethically for resumes, interview preparation, and professional communication. We’re very explicit that AI should enhance their voice, not speak for them. The way I use AI in teaching is to make the invisible systems of hiring visible, so students understand how AI is actually being used in the workplace and how to engage with it responsibly.
I regularly use AI for generating ideas, brainstorming, revising, and editing instructional materials. For me, it functions as a problem-solving partner during course design, helping me think through options and refine materials before they ever reach students. I’ve been explicit with students when AI is used in instructional materials, even using the phrase ‘co-authored with AI.’ Being transparent about that process opens up important conversations about authorship, ethics, and how these tools can responsibly support learning. When we talk about using AI in teaching, we’re really talking about intentional integration rather than experimentation for its own sake. I encourage faculty to start by deciding on a clear policy—allowed, limited, or prohibited—and then building assignments that align with that choice. We’re not doing the Wild West with AI, but we are allowing room for exploration within guardrails. I see AI as a way to reduce friction in teaching tasks, support creativity, and open up space for deeper learning, while still keeping faculty and students safe, ethical, and legally protected.
One of the things we’re actively working on is how students acknowledge AI use in their coursework. There’s an AI citation module drafted for Canvas, and the idea is to make expectations explicit rather than punitive. If students are using AI tools as part of their learning process, we want them to document that use clearly and thoughtfully, so faculty can focus on evaluating learning rather than policing tools. In my teaching, I use AI both as a practical tool and as an object of critique. Students work with large language models to understand what they can do well, but we spend just as much time analyzing their limitations, biases, and economic implications. Teaching with AI means helping students recognize that these systems generate language without understanding, and that matters deeply for scholarship, translation, and knowledge production. My goal is not to discourage AI use, but to cultivate informed skepticism and responsible engagement so students can work with these tools without surrendering intellectual judgment.
I use AI in teaching as a creative partner rather than a design replacement. In my studios, students work with generative models to explore architectural form, sustainability strategies, and spatial possibilities, but always within a structured workflow that preserves their design agency. AI helps students iterate faster and visualize ideas they might not reach on their own, while evaluation tools like daylight simulation ensure the designs remain grounded in performance and ethics. My approach to teaching with AI is about choreographing intelligence—human and artificial together—so students learn how to guide AI instead of being led by it.
I used AI-supported policy examples to help me draft my syllabus language, and I was transparent with my students about how I did it. I didn't copy and paste, but I brainstormed from existing examples and adapted them to fit my course. That process helped me be clearer and more intentional about expectations around AI use.