The Rockwood Technology Department has updated its artificial intelligence guidelines for student use to include Google Gemini and NotebookLM, two AI tools that give students a monitored AI experience.
Both Gemini and NotebookLM have protections that keep student data from being used to train the AI model and prevent that data from being reviewed by Google, Chief Information Officer Bob Deneau said.
Deneau said ChatGPT and Gemini are similar in that they both use large language models (LLMs), but they differ in how they handle user data. ChatGPT uses the data it collects to train its model, whereas Gemini does not.
Gemini also has LearnM, a guardrail that can stop certain AI prompts based on the topic, Deneau said. Certain Rockwood Technology Department staff can review a student’s Gemini sessions if there are concerns.
“This level of reporting provides us with additional tools to combat misuse,” Deneau said.
Previously, AI chatbots used by students had no regulations or rules ensuring they were used with academic honesty.
“AI is increasingly becoming a part of the workforce and higher education, and we want to ensure that Rockwood students start to build positive AI skills so they are prepared for whatever their next step will be after high school,” Deneau said. “As we learn more, get more feedback, etc., we will continue to examine our guidance for students and staff to make updates when appropriate.”
Lauren Williams, language arts teacher, hasn’t used Gemini or NotebookLM yet, but said she prefers Skill Struck in her classroom because she likes how it won’t engage in conversations about mental health topics or use inappropriate language.
Williams said she can also see students’ chat history on Skill Struck, which helps her talk to students about what acceptable AI use looks like in her classroom.
Williams said high school students should use AI to help revise and edit their writing rather than having it do the thinking for them.
“You have to know how to write without AI to be able to use AI effectively, but students are doing it in the other order, and then they don’t know how to write without it and that is where they get into some trouble,” Williams said.
Williams also said detecting AI is time-consuming, especially for language arts teachers.
“One student using AI could be an hour of my time,” Williams said. “And if there are five students that use AI on one paper, that’s five hours of phone calls and discussions and referrals.”
Williams said AI can be a tool to help students make study guides and practice questions. However, using AI to complete assignments without having discussions with adults about ethics and the reason for using AI can create a harmful habit.
Now, Williams sees students use AI to cheat on simple assignments such as class introductions.
“So it’s that default of ‘I’m just going to go to AI first’ that’s a problem,” Williams said. “So I think some teachers are hesitant just because you open the floodgates and are afraid of what’s going to happen.”
Williams said building AI skills in middle school and setting clear standards for AI use can help students.
“I think teachers just need to be more upfront with what is acceptable and what’s not, because I think [when we’re] not talking about it, sometimes it gets misused,” Williams said.
Everleigh Foulk, sophomore, said she is aware new AI resources are available to her, but her teachers haven’t brought them up in class. She said she uses AI to ask questions about personal situations.
Foulk said students should not depend on AI to do their work for them.
“If it’s just to correct grammar, and usually in foreign language classes they’ll have us use it to check spelling mistakes, but as long as you’re learning from that I think it’s OK,” Foulk said.
With the growth of AI, Foulk said she is worried about her interest in a writing career.
“I worry that people will want to read that [AI writing] more than stuff that actual people write,” Foulk said.
Additional reporting by Siddharth Siwant
