What Are You Curious About?
By Lew Ludwig
What are you curious about?
This is not a question we often pose to students on the first day of class. Typically, I might outline the overall goals of the course: to develop analytical abilities, hone problem-solving skills, and learn to communicate arguments clearly and succinctly. If pressed, I could list specific learning objectives, such as the ability to interpret the derivative of a function at a point, define and interpret congruences, or determine eigenvalues and eigenvectors. But what if we began by asking students not just what they hope to learn, but also how they envision applying these skills to shape their world?
Navigating Uncharted Waters
Embarking on this AI adventure with my students this semester, I find myself in uncharted waters. There exists no "AI canon" to reference, no seminal texts to guide us. At the semester's outset, my "intern" – ChatGPT 4.0 – assisted in crafting a 14-week syllabus covering topics such as prompt engineering, AI in data analysis, and the ethics of AI. However, three weeks in, I've come to recognize the shortcomings of this approach. Unlike the math courses where I possess a certain level of expertise, when it comes to AI my students collectively know as much as or more than I do. This realization presents a unique opportunity to transform them into co-creators of the course. It's time to embrace this shift and explore what sparks their curiosity about AI.
Where are the students?
This realization became clear to me during a recent meeting. Like many academic institutions, mine hosts an "AI working group" composed of administrators and faculty members. Formed in late July 2023, our eight-member team was tasked with addressing growing concerns about AI on campus. To date, the group has conducted two campus-wide sessions to help faculty navigate this new landscape.
However, students have not been included in these discussions—a situation that must change. As we agonize over the potential challenges AI presents—pondering whether it necessitates a complete overhaul of our assessments—we overlook the crucial perspective of those who share the classroom with us: our students.
Know your audience
Armed with this new insight, I decided to deviate from the planned topic of prompt engineering in our last class to better understand my students' needs and concerns. An early course survey had already revealed that all but one student agreed or strongly agreed with the statement, “I would like to understand more about how AI works and how to use it effectively.” Additionally, 70% of the students reported having only a moderate understanding or less of AI, and 20 students agreed or strongly agreed that “AI is likely to play an important role in my career.” But I wanted to know more—what specifically intrigued them? What were they curious about?
Your audience can teach you things
After a few prompting questions and hastily generated anonymous polls, here is what I found. Students want a clearer understanding of faculty expectations and the reasons behind them. Collectively, my students were enrolled in 87 different classes. In 31% of these classes, the use of AI was prohibited. Meanwhile, in 23% of the classes, there was either no guidance provided on AI usage, or, more accurately, students were unaware of any existing AI policies.
Takeaway number one - talk to your students
Talk to your students about AI. If AI usage is not permitted, clarify your reasons. Conversely, if it is allowed, share your rationale. Revisit these conversations periodically. The science of learning underscores the value of distributed practice, and the same principle applies here. Maintaining an open dialogue with students about AI is crucial. Ryan Watkins' post offers an excellent guide for facilitating these discussions.
Takeaway number two - our students share our concerns
When I asked my students what they were curious about with regard to AI, their responses ranged from the expected—such as "What can it do?" and "How does it work?"—to concerns that hit closer to home, like "Is using AI considered plagiarism?" and "How many jobs will AI replace? Will my degree still hold value in an AI-dominated world?"
Sound familiar? By asking them about AI, I discovered that my students share many of the same concerns we do. While we fret about potential academic integrity violations (valid concerns, to be sure), we risk overlooking the broader picture. Our students are also anxious about the impact of AI on society. With this understanding, shouldn't we collaborate with our students to navigate this new AI-driven world together? It's crucial to recognize that no single person is the expert; rather, the collective wisdom of the room prevails. We must face this challenge as a united front.
Looking ahead
You might be wondering, "But what about the calculus?" This week marks a significant milestone as my students write their first calculus test for me (see The Cheat Proof Calculus Exam). For this assignment, not only do they create the test, but they also have to write a solution key that other students in the class would understand. Why does the limit of f(x) as x approaches 3 not exist? Their struggle with articulating clear explanations sets the stage for our next endeavor: leveraging generative AI to refine their explanations. Given the graphical nature of these questions, it promises to be a true test of AI's capabilities. My hope is that it will take a true human-AI collaboration to improve these explanations. Stay tuned for our next post, where we'll dive into the outcomes of this experiment. Your support and curiosity fuel our journey—wish us luck!
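For readers who want a concrete picture of the kind of question at stake, here is one hypothetical example (not drawn from the students' actual test) of a function whose limit at 3 fails to exist:

```latex
% A standard jump-discontinuity example: the one-sided limits disagree.
f(x) = \frac{|x-3|}{x-3}, \qquad
\lim_{x \to 3^{-}} f(x) = -1, \qquad
\lim_{x \to 3^{+}} f(x) = 1.
```

Because the left- and right-hand limits differ, \(\lim_{x \to 3} f(x)\) does not exist. Explaining *why* the one-sided limits must agree, in language a classmate could follow, is exactly the kind of articulation students find difficult.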
Lew Ludwig is a professor of mathematics and the Director of the Center for Learning and Teaching at Denison University. An active member of the MAA, he recently served on the project team for the MAA Instructional Practices Guide and was the creator and senior editor of the MAA’s former Teaching Tidbits blog.