Faculty News

Teaching at Bennington at the Dawn of AI

The Faculty Educational Policies Committee has issued a statement governing students’ use of Artificial Intelligence (AI) in coursework. It is embedded in the College’s academic ethics policy, which until recently dealt primarily with plagiarism and the attribution of others’ work.

It reads: the use of AI “is permitted with the explicit permission of the course faculty member. All AI-assisted work must be disclosed, including the tool used and the nature and extent of its assistance. Failure to disclose AI usage or representing AI-generated content as one’s own original work constitutes academic dishonesty.”

While faculty members admit that detecting students’ use of AI is getting harder as AI models improve, small classes at Bennington make it easier to spot discrepancies between work produced in class and work done independently.

Katie Montovan, faculty member in Math

“When I teach programming, I have seen students use completely crazy code that is nowhere close to what I introduced,” said Katie Montovan, faculty member in Math. “Either they found an example online and used it, or they used AI.” 

Faculty members have noticed that some students put their reading assignments into an AI to summarize them. In that case, faculty member in Cultural Studies and Languages Noëlle Rouxel-Cubberly said, “it is easy to tell who has done the work and who has not. We are lucky here to have small classes, so we can debate and analyze in the classroom.”

Most cases of inappropriate AI use are handled between the student and the faculty member. The faculty member might, for instance, ask the student to redo the assignment and not use AI again without permission and attribution. Only a few cases per term are escalated to the Dean’s office.

Tweaking Assignments

As concerned as faculty members are with catching students using AI inappropriately, they are even more interested in creating an environment that helps students learn the dangers of AI and how to use it as a tool that enhances, rather than inhibits, learning.  

“We are all now challenged to rethink how we teach and what we ask students to do. What is the work that allows them to present their own learning?” said Noelle Murphy, Dean of Studies. “It can’t just be about ‘what can we do to keep students from using this tool that is now pervasive in society.’ It has to be ‘how can it be incorporated in a way that is meaningful and can actually help advance their learning and is true to the way we teach and learn here.’” 

Faculty members are using some old-fashioned methods to discourage students from using AI to skip the challenging parts of their work. Alex Creighton, faculty member in Critical Writing, prompts his class to make a “community agreement” that helps provide guidelines for all class expectations, including the use of AI.

“We create rules and community norms together, and the use of AI factors in,” said Creighton. “We discuss the ethics of writing and plagiarism. Citation is there so we have a paper trail. That’s how scholars build a conversation and advance learning together. If citations disappear, the conversation disappears. AI doesn’t support that.”

Faculty members also ask students to complete different assignments or to do them in different ways. Rather than asking students to analyze a piece of writing or a film as a whole, Creighton asks them to choose a favorite paragraph or 30-second clip and analyze how the details used in that clip or paragraph relate to the themes of the piece as a whole. 

“AI has a much harder time analyzing a small clip or a single paragraph, one that may never have been written about in isolation before, than it does an entire essay or film,” he said. These exercises encourage the development of close reading and critical thinking skills.

Similarly, Rouxel-Cubberly asks students to analyze a film they have just watched by handwriting a journal entry from the perspective of one of its characters. Skipping the keyboard altogether and requesting a more emotional, personalized interpretation disincentivizes using AI. She notes that handwriting assignments also benefits the brain, helping students recall information later.

Katie Montovan relies on conversation and on working out problems on paper.

“Slowing students down really helps them get the sense of how to think about things, and that’s a lot of what we do at Bennington that I don’t see happening at other schools as much.”

On the Frontlines

These are initial experiments in how to deal with this new technology in a higher education context, said Jared Della Rocca, Director of Library Services. 

“Faculty members and administrators are, in many ways, on the front lines,” he said, “both as educators experimenting with the technology and as observers of how quickly students are beginning to adopt it.”

Bennington College faculty members are using some class time to show students how AI introduces bias or erases history. In his writing class, The Scriptorium, Creighton prompts students to compare AI- and human-generated critical writing side by side. He directs students to read both and point out the differences. Among several notable differences, the human scholar noted specific historical events in an analysis of Langston Hughes’s poem “Harlem,” whereas AI generalized. 

“That’s a danger because it is a misrepresentation,” said Creighton. “If I received the AI-generated essay from a student, I would ask, ‘can you be more specific? Did you research using a credible source?’ It’s a teachable moment.” 

Jared Della Rocca, Director of Library Services

Students also identified factual inaccuracies and superficiality. The AI-generated piece didn’t make an interpretive argument or present a fresh perspective, the students found.  

“Most students understand that AI is not good enough,” said Creighton. “It is drawing on what exists. It cannot push thought forward. The heart of critical writing is ‘let’s teach something new, something the reader didn’t see in the text before.’” 

Rouxel-Cubberly shows AI’s bias by having it create a poster for a film she and her students are studying, and then analyzing the poster with students. She has found that AI romanticizes and grossly misinterprets the Black experience in Ousmane Sembène’s film La Noire de…. That is a disturbing revelation to Rouxel-Cubberly, but it highlights the importance of real teaching and learning during this pioneering time.

“I have taught this film 15 times, and students still come up with new and valid points and interpretations. Students show that there is new discourse,” said Rouxel-Cubberly. “AI reduces. It is good at producing what is expected. What we are interested in is the unexpected. The creative. AI is going to point students in expected directions, and we need to move students in unexpected ways.”

In an experiment meant to highlight the cultural differences between AI from different places, Rouxel-Cubberly asked an AI to “provide words associated with chocolate.” An American AI will provide a consumer’s view: Delicious. Valentine's Day. Lovers… “That is a very western idea of what chocolate is,” she said. “A French AI will view chocolate from a culinary perspective. To the French, chocolate is an ingredient. They are interested in craftsmanship.” 

She notes that words representing the growers and producers of chocolate (the Ivory Coast, for instance, is the number-one producer of cocoa beans) do not appear in either result.

“Even if you ask for the top countries associated with chocolate, you get Belgium and Switzerland, not the countries where chocolate is produced. That is outrageously negative. We need to teach students in the classroom how to interpret what they are receiving and how damaging the outputs might be without critical analysis.”

Deep Experience

Many faculty members remain hopeful that AI can eventually be used to enhance learning, rather than inhibit it. Rouxel-Cubberly and Ikuko Yoshida, faculty member in Cultural Studies and Languages, both of whom began teaching at Bennington nearly 30 years ago, remember the initial upheaval that came with the introduction of the Internet.

Ikuko Yoshida, faculty member in Cultural Studies and Languages

“Noelle and I have always been interested in using technology in our classrooms. If it can be used to enhance learning and address inequity, we should explore effective ways to implement it,” said Yoshida.

In 1998, Isabelle Kaplan, then director of the Regional Center for Languages and Cultures at Bennington College, created News Online for learners of French, German, Italian, Japanese, and other languages. It was a cutting-edge program, the first of its kind. They found a way to take something that was believed, at the time, to disrupt language learning and turn it to their advantage. They are determined to investigate AI in much the same way. Their refrain: “How can we teach students to use AI as a tool?”

Their research explores AI’s effect on the fear of reading and writing among introductory-level language students and the integration of virtual reality in educational settings. Funded by a grant from the Sherman Fairchild Foundation, they presented the work at the Northeast Conference on Teaching of Foreign Languages in February. 

“I stress the importance of the learning process,” said Yoshida. “AI [as it is currently configured] is more focused on providing the end result. If something can give students the end result, they may have completed their work, but they are not learning. We need to train students to think that AI is a tool, and we humans need to use that tool to enhance or obtain the information we need and to think and evaluate it critically.”

Yoshida directs students to practice common conversations with AI in advance of an upcoming lesson.

“I provide a very specific prompt, including what vocabulary should be practiced. For example, ‘I am a beginning Japanese student. I would like to practice a conversation I might have while shopping using the following words,’” said Yoshida. “Engaging in practice with AI helps reduce students' anxiety associated with errors or fear of judgment when providing incorrect responses. Consequently, my objective is to enhance students' confidence and improve their in-class performance by having them complete this type of AI activity before attending class. Additionally, this helps students improve their reading and writing skills. Oral conversations can be practiced in the classroom.”

AI as a Tool

Director of Library Services Jared Della Rocca agrees that AI can be used thoughtfully to inspire learning. “As generative AI tools have become more common, I’ve come to see them as something that can be incorporated thoughtfully into the research process rather than avoided entirely. Used appropriately, large language models [which are the basis of AI] can help students begin exploring a topic, generate potential keywords, or clarify unfamiliar concepts, which can make their initial research more effective.”

AI tools, he said, are better used as a starting point rather than as sources themselves. 

“Students still need to verify claims, locate peer-reviewed scholarship, and engage critically with the materials they find through library databases and other scholarly resources. In that sense, AI becomes another tool in the research workflow—useful for brainstorming and refining questions—but the core practices of academic research remain the same: evaluating sources, consulting credible scholarship, and building arguments grounded in evidence,” said Della Rocca. “Our goal is to help students understand both the possibilities and the limitations of these technologies so they can use them responsibly and effectively in their academic work.”

Some faculty members are participating in workshops presented by the Bennington Center for AI, which launched in March and is led by faculty member in Computer Science Darcy Otto. In addition to offering programs and classes in how AI works, AI ethics, and (once prerequisite coursework is complete) how to build an AI, the Center challenges faculty members to find ways to incorporate AI into their class work and to offer classes that consider AI in their fields. After a faculty workshop with the Center, Creighton is conceptualizing an assignment where students feed their original essay into an AI with the prompt to create a counterargument to which they would then respond. 

“That’s part of the education that we are responsible for,” said Creighton. “I like these exercises that use AI to complement the thinking, rather than replace it.”