The machine in the classroom
Artificial intelligence has spread across the digital landscape like ants across a lawn. Hubs scattered in plain sight offer a glimpse of hidden networks powered by machine thinking. From backend tools guiding technology, to predictive text that helps write emails, to machine-made art, AI isn’t just the future; it’s now.
So, it may come as no surprise that AI is in public schools. As more employers use it, graduates must be competent with it to be competitive hires. But it's not just about jobs. AI tools are also available to help educators and students. However, AI comes with risk. So how is it being used in local classrooms? What are the opportunities, and where are the pitfalls?
Boothbay Region High School (BRHS) Principal Tricia Campbell said the school’s tech team is working to ensure students are exposed to emerging technologies they might use in their careers or college. She also said there is an important balance to strike in maintaining academic integrity. “As we move into a future that does rely partially on AI, it’s going to be important that students understand the tools and their limitations,” she said.
Earlier this year, the school's technology team presented prospective guidelines to the School Committee that laid out a blueprint for how teachers should approach AI. The team noted that AI isn't used in the younger grades, only the high school. However, the guidelines have not been voted on by the board.
At BRHS, some teachers employ AI tools to help develop lesson plans and teaching aids, or to change the reading difficulty of resources to match a student’s needs. AI tools can also help generate ideas for assignments, questions for research projects, or essay prompts.
“I think about enhancing the creative aspects that students have to integrate. The research aspects, the presentation aspects, the sort of cross-disciplinary aspects," said Technology Integrator Stacy Gauthier. “That's the best use of generative AI, just in general. Not as a substitute for student work, but sort of kickstarting student writer's block, kick-starting student's creative process.”
Gauthier said AI is especially useful for students with special needs. Tools such as AI-powered voice typing have helped students with motor-skills deficits and limited the need to hire a scribe, allowing students to learn more independently. “The teachers that are more willing to embrace some of the AI technology are teachers who I've had a lot of success with the special education department,” she said.
Science teacher Emily Higgins said she was an early adopter of AI. She uses it to help students dive deeper into project ideas or ask questions about a subject they know nothing about. During a project on nuclear fusion, she said, a tool called Perplexity generated questions on the difference between fission and fusion, along with sources for finding answers. Higgins could then guide students in a critical-thinking lesson on which sources are more legitimate than others.
“I think it's important for people to realize that it can be used as a useful tool while still having your own original thought,” she said.
However, Higgins said not all her peers are eager to use AI. Some have embraced it, some are hesitant, and others are opposed. According to her, teachers in the humanities and creative subjects have been more reluctant. How AI is changing students’ critical-thinking development and creativity isn't fully understood, but it raises red flags for some educators.
“I'm worried about taking away the critical thinking skills and the problem-solving skills that we want our students to have to go out into the world with,” Manahan said. “I worry about that from a parent perspective. I worry about that from an educator’s perspective. I worry about it from a citizen perspective. If we're not supporting those skills, I'm not sure what our future looks like.”
Whose work is it?
For many, the elephant in the room is cheating. Critics say it is easy for students to have AI complete their assignments with a few clicks. Manahan said Google inspired similar concern when it arrived, with some worried that students could simply look up answers. Still, even though Google has rolled out several AI-powered features, a Google search isn't the same as AI.
“Since the dawn of time, humans have found ways to cheat,” Manahan said. "You can still buy a full research paper online like when (I was) in college. There were CliffsNotes and SparkNotes and all different ways to sort of distill an assignment down to make it easier. It's just how you define cheating.”
As tempting as it could be to ban AI to combat cheating, Gauthier argues that students should learn how to use it or be left in the dust. “If we keep blocking it, if we keep shutting it down, the kids are never going to get that experience,” she said.
Instead, Manahan said the school aims to make rules that allow ethical use of AI balanced with educational responsibilities. She said the school wants to limit “gotcha” scenarios and turn inappropriate AI use into teaching moments. If a student is caught misusing AI, she said teachers should determine why the student felt it was necessary and help model how AI could have been applied appropriately.
According to Higgins, students cheat for several reasons. She said they may not understand what is being asked of them, or they run out of time and panic. As a teacher and a parent, she said she wants to foster a love of learning rather than a habit of taking shortcuts.
However, in just the past few years, students have gone from using AI to research parts of an essay to having it write the whole thing in a style of their choice. In addition, AI detection tools are unreliable, according to Gauthier. She said the tech team encourages training students by modeling appropriate uses and designing assignments that can't be completed wholly with AI. Another approach is to adjust assignments toward higher-level problems to elevate the challenge and encourage critical thinking.
“If you focus more on process and if you make your prompts, what you're asking students to do, more unique, then it requires them to do a little bit more thinking outside of what the AI could generate for them,” Gauthier said.
Manahan said she demonstrates where the technology can fail by creating scenarios where AI can’t be trusted or can't complete the task. That way, students also learn critical-thinking skills about AI's limits and how best to use it as a tool. Manahan said the tech team is trying to get teachers comfortable with the technology, but not too comfortable.
“There are reasons to be cautious. Extra, extra cautious,” she said. “It’s an important part of a teacher's responsibility to educate on appropriate use, ethical use of technology.”
Cheating isn't the only concern around AI. The effects of AI, social media, and other modern technology on young minds aren't fully understood by the health and science community. Furthermore, AI presents some security risks. To train AI, developers mine information from the internet and users, and companies are not always open about what they take and how it is used. ChatGPT, the most well-known AI, is prohibited at the school due to such security concerns, according to Manahan.
“It's the wild west,” Manahan said. "Tech companies get to do whatever they want, and all they are concerned about is the bottom line. They're not thinking about the user. They're not thinking about the long-term impacts of what it means to collect likes and look for affirmation.”
The kids are alright
Despite concerns, BRHS educators say students have a balanced view of AI. According to a survey of around 120 BRHS students, most were comfortable using AI tools, especially those that help them write, edit and research. Gauthier said she thinks students are more skeptical than they once were, especially when it comes to evaluating information, bias and credible sources.
In the survey, most students agreed that using ChatGPT to write an assignment was cheating and that it should either never be allowed in school or be allowed only in certain situations. Opinions were mixed on using AI to formalize a draft, but most agreed that using it for research, finding topic ideas and fixing grammar was not cheating.
“I'm glad to see that kids are thinking about this through the lens of, ‘I don't know, it could be good, but what could some of the negative aspects be?’” Gauthier said. “I think that just reflects maturity on the part of today's student. As future citizens of the world, I think they feel like that about a lot of things ... they are a little bit more cautious to weigh out the pros and the cons.”
Manahan said her own child, 13, isn't sure about AI. She said he understands it, but is unsure about how to apply it. “I think they're looking for guidance from the grown-ups to some extent. They want boundaries, I think ...” she said.
On a larger scale, Manahan said it's her job to keep an eye on the technology landscape at BRHS and other schools to see how tech could affect children. She acknowledged it is a difficult time to be a parent as technology seems to be dominating kids’ attention. She said parents are facing a lot of uncertainty around how to keep children safe, but there is power in talking about it.
"There (is) a community conversation that needs to be had around technology use with our kids,” she said. “We don't have all the answers, but some of us have insights into bigger conversations that I think can bring some clarity, but also some reassurances. That's what I want to do with community. I want to acknowledge the elephant in the room and pull the curtain back and be like, ‘Okay. This is what it is. And we're all in this together. And we're going to figure it out together.’”