AI has arrived
Artificial intelligence technology is already infiltrating classrooms
By Heather Kemp
“Should K-12 schools embrace AI?” Pose that question to ChatGPT and the chatbot offers a measured answer of its own.
Warnings and testimonials about artificial intelligence (AI) have taken over newsfeeds since the tool, an AI large language model (LLM) created by OpenAI, launched its free preview for users to interact with and submit feedback on in November 2022. ChatGPT is designed to hold human-like conversations and can answer questions and complete tasks such as writing emails, essays and code.
Its introduction has caused concern among some in education who believe it will assist students in cheating on assignments and further their dependency on technology, in addition to other issues.
In December 2022, a Los Angeles Unified School District spokesperson told The Washington Post that the district had blocked access to OpenAI and ChatGPT on its networks and devices in an attempt to protect academic honesty as risks and benefits were assessed. A number of districts throughout the country took similar actions.
Others, however, like Anaheim Union High School District and Val Verde USD, have been embracing the use of AI by both students and staff long before ChatGPT went live.
Anaheim Union HSD offers an artificial intelligence career pathway for local learners while Val Verde USD is utilizing digital assistants to support teaching and classroom management.
The benefits of AI that ChatGPT listed include its ability to provide students with personalized learning; efficiency in performing administrative tasks; easier access to tools that enhance teaching; and its capability to analyze large data sets to find trends in student performance, helping teachers make informed decisions, identify students who are struggling and provide early intervention.
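One of those claims, that AI can comb through large data sets and flag students who may be struggling, can be pictured with a few lines of code. The Python sketch below is a generic, hypothetical illustration (the student names, scores and 70-point cutoff are invented), not a description of any district’s actual early-warning system.

```python
# Hypothetical illustration: flag students whose scores are trending down or
# falling below a chosen threshold, so a teacher can follow up early.
# Names, scores and the 70-point cutoff are invented for this example.

from statistics import mean

gradebook = {
    "Student A": [88, 84, 79, 72],
    "Student B": [91, 93, 90, 94],
    "Student C": [75, 68, 62, 58],
}

THRESHOLD = 70  # assumed passing cutoff for this illustration

def needs_intervention(scores, threshold=THRESHOLD):
    """Flag a student whose average is below the threshold
    or whose scores have dropped steadily across assessments."""
    declining = all(later < earlier for earlier, later in zip(scores, scores[1:]))
    return mean(scores) < threshold or declining

for student, scores in gradebook.items():
    if needs_intervention(scores):
        print(f"{student}: consider early intervention (scores: {scores})")
```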
Another factor it weighed is the teacher-student relationship. “The role of teachers is essential in fostering student engagement, critical thinking, and social-emotional development. While AI can provide support, it should not replace the human connection between teachers and students. Balancing the use of AI with human interaction is crucial,” according to ChatGPT.
Many of the topics that ChatGPT listed in its reply are explored in the research brief “Balancing the Benefits and Risks of AI Large Language Models in K-12 Public Schools,” published by Virginia Commonwealth University’s Metropolitan Educational Research Consortium (MERC) in April.
Jesse Senechal, MERC’s director, noted that while historically there has been research conducted on AI and education, continuing research in the area is needed.
Recent research
“There are also models that do images and sound, but they’re trained on large bodies of data; with ChatGPT, it’s text,” Senechal said. “What they’re doing is pattern recognition, where they’re trained on the data, and then they’re able to respond to prompts and predict likely answers to that based on the patterns they’ve seen in the data.”
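Senechal’s description, a model trained on text that predicts likely responses from patterns it has seen, can be sketched at toy scale. The short Python example below builds a tiny “next word” counter from a few sentences; it is a deliberately simplified stand-in for the prediction idea, not how ChatGPT actually works.

```python
# Toy illustration of "predict the likely next word from patterns in training text."
# Real LLMs use neural networks trained on vast corpora; this bigram counter
# only shows the basic idea of pattern-based prediction.

from collections import defaultdict, Counter

training_text = (
    "students use ai tools . teachers use ai tools . "
    "teachers guide students . students ask questions ."
)

# Count which word tends to follow which.
follows = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follows[current_word][next_word] += 1

def predict_next(word):
    """Return the most common word seen after `word` in the training text."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("teachers"))  # -> 'use' in this toy corpus
print(predict_next("use"))       # -> 'ai'
```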
Senechal understands why districts banned the new technology from their systems but does not see that as a strong long-term strategy.
“I don’t think it’s going to be something that we can avoid. I think we need to take it head-on,” Senechal said. “I made a presentation about this brief to some local superintendents … saying, ‘We need to develop a strategy around this and a policy around this right now. Starting next fall, we need to have a plan in place for this — otherwise we’re going to be reactive and we’re going to fall behind it quickly.’”
Whether or not school or district leaders are aware, AI is likely being used on campuses, Senechal added, “and so I think that initial questions of, ‘who’s doing it, how are they doing it and who’s doing it well?’ are important questions to ask at this point.
“At the same time, it’s brand new,” Senechal continued. “It’s hard to imagine where we’re going to be in a year or two years with this, but I think it’s better that we start to ask those questions now.”
Senechal suggested AI tools could potentially create space for more relationship- and community-building between teachers and students by automating a portion of teachers’ administrative work.
Using LLMs for lesson planning can ultimately support inclusivity. LLMs may be used to personalize content such as writing prompts or classroom activities to match individual student needs and learning styles, and they have the potential to enhance accessibility for students with disabilities as well as to craft adapted lessons for English learners. For delivery of instruction and assessments, LLMs can provide personalized tutoring and even identify potential biases in assessment practices, allowing for more individualized and equitable student learning experiences.
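One way to picture that kind of personalization is as a simple request sent to a chat model. The sketch below assembles such a request using the OpenAI Python client as one possible route to an LLM; the grade level, reading level and model name are assumptions chosen for illustration.

```python
# Hypothetical sketch: ask an LLM to adapt a classroom writing prompt for an
# English learner at a lower reading level. The grade level, reading level and
# model name below are assumptions for illustration only.

from openai import OpenAI  # assumes the OpenAI client is installed and an API key is configured

client = OpenAI()

original_prompt = "Write a persuasive essay about whether schools should adopt AI tools."

request = (
    "Rewrite the following writing prompt for a 7th-grade English learner "
    "reading at roughly a 4th-grade level. Keep the topic the same, shorten the "
    "sentences, and add a two-sentence example to get the student started.\n\n"
    f"Prompt: {original_prompt}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model would work
    messages=[{"role": "user", "content": request}],
)

print(response.choices[0].message.content)
```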
Research, writing and studying assistance, personalized learning, language learning and self-advocacy are some of the ways students might use LLMs, according to the brief.
Concerns around the use of products like chatbots include weakened academic integrity, student privacy, the possibility of being fed inappropriate content, inherent biases from the data used to train the model, equity considerations, and teacher and student overreliance.
CSBA’s AI taskforce may consider a multitude of issues related to AI and schools, including, but not limited to, how to positively integrate AI technology into the instructional setting; the ethical, legal and privacy implications of AI usage in schools; appropriate policies for the use of AI; and the potential risks of using it in schools.
“The AI Taskforce: Education in the age of artificial intelligence embodies our commitment to provide CSBA members with the resources for a high-quality education that harnesses the power of innovation while ensuring the well-being of students, supporting content mastery, and protecting the integrity of the teaching and learning process,” said CSBA CEO & Executive Director Vernon M. Billy.
To that end, Senechal, who is a former teacher, remarked, “We need to quickly rethink our media literacy and digital citizenship curriculum and help students understand how to navigate a world where AI technologies are being infused throughout.”
On student privacy, Senechal and his research partners acknowledged that teachers could unintentionally expose sensitive student information and advised potential users to “ensure that LLMs comply with relevant laws and regulations, such as Family Educational Rights and Privacy Act (FERPA) in the United States.”
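A practical corollary of that warning is to scrub identifying details before student work or records ever reach an outside tool. The minimal Python sketch below illustrates the idea with simple pattern matching; on its own it would not be sufficient for FERPA compliance, and the names and ID format are invented.

```python
# Minimal, hypothetical illustration: replace student names and ID-like numbers
# before text is sent to an outside LLM. A real FERPA review would go well beyond
# this simple pattern matching (vendor agreements, data retention, consent, etc.).

import re

KNOWN_STUDENT_NAMES = ["Jordan Rivera", "Sam Lee"]  # invented names for illustration

def scrub(text: str) -> str:
    for name in KNOWN_STUDENT_NAMES:
        text = text.replace(name, "[STUDENT]")
    # Replace runs of 6 or more digits that could be student ID numbers.
    text = re.sub(r"\b\d{6,}\b", "[ID]", text)
    return text

note = "Jordan Rivera (ID 4821973) missed three assignments this week."
print(scrub(note))  # -> "[STUDENT] (ID [ID]) missed three assignments this week."
```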
LEAs should consider reviewing their current policies around topics like student privacy, academic integrity or use of technology and see how they can be revised to address these new technologies.
The MERC brief also provides policy considerations for local educational agencies and recommendations and guiding questions for educators and education leaders.
Recommendations include staying informed on advancements; adopting explicit policies regarding AI usage at the district-, school- and/or classroom-level; embracing chances to scaffold teachers’ work; providing professional development on LLMs to teachers; purposefully disconnecting when appropriate; and teaching students to appropriately use LLMs.
Microsoft and Google have already integrated, or are planning to integrate, AI into commonly used products like Word, Outlook and Docs.
“It’s an important time to have a conversation around how technology can support our aims as an educational system. How can it really enhance the work we do?” Senechal said.
“The incredible thing is that this technology, Merlyn Mind, is untethering our teachers from their desks and allowing them to be mobile in the classroom with their students,” said Val Verde USD Superintendent Michael McCormick, whose district has put the AI-powered digital assistants to work in its classrooms.
Teachers have a voice-enabled handheld remote and can direct the device, referred to as “Merlyn,” to play PowerPoint presentations and YouTube videos, add content to Google Classroom folders and more.
The district was the first in the U.S. to have the devices, according to McCormick.
Having virtual assistants in classrooms was an idea that the district’s technology integration specialist, Phil Harding, toyed with for a few years before finding the right fit. The district started with 10 devices that were programmed with fewer than 20 commands and could respond to general questions. Incorporating input from teachers, the devices now have more than 160 commands, in addition to answering general factual questions, such as the height of a landmark or a comparison of two landmarks’ heights (as opposed to the more open-ended questions ChatGPT can answer).
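Merlyn’s software is proprietary, but the general pattern, mapping a recognized phrase to a classroom action, can be sketched generically. The example below is a hypothetical command dispatcher written for illustration; the phrases and actions are invented and are not Merlyn Mind’s actual command set.

```python
# Generic, hypothetical sketch of a voice-command dispatcher: a recognized phrase
# is matched to a classroom action. This is not Merlyn Mind's actual implementation;
# the commands and actions below are invented for illustration.

def open_presentation():
    print("Opening today's slide deck...")

def play_video():
    print("Playing the queued YouTube video...")

def start_timer():
    print("Starting a five-minute timer...")

COMMANDS = {
    "open the presentation": open_presentation,
    "play the video": play_video,
    "start a timer": start_timer,
}

def handle(transcribed_speech: str):
    action = COMMANDS.get(transcribed_speech.strip().lower())
    if action:
        action()
    else:
        print("Command not recognized.")  # a real assistant might fall back to general Q&A here

handle("Play the video")   # -> Playing the queued YouTube video...
handle("Dim the lights")   # -> Command not recognized.
```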
Unlike consumer devices such as Amazon’s Alexa, Merlyn was built for use in a school environment; it doesn’t retain personal information or recordings of voices, Harding clarified.
Although it was designed to assist educators, Val Verde USD teachers have gotten students involved in interacting with Merlyn, with activities like competing against the device to solve math problems. This is something that students are comfortable with as they’ve grown up with AI around them and in their homes, Harding noted.
“Let’s face it, we have the future sitting in our classrooms right now and I feel a strong obligation to create an environment where our students can safely investigate and learn the power of this technology, and learn it in an ethical, responsible way,” McCormick said of their approach. “I almost would equate it to digital citizenship. As a school district, we issue all of our kids Chromebooks. To me, part of the responsibility of issuing and making technology available to our students is digital citizenship, responsible use and learning the basics about student privacy and not sharing things, not assuming somebody else’s identity and stuff like that.”
McCormick noted that he’s heard of teachers beginning to experiment more with AI tools like Curipod, an AI-driven site that creates lesson plans and PowerPoint presentations with interactive pieces built in.
Khanmigo, Khan Academy’s AI-powered tutor and teaching assistant, is rolling out slowly, and Atwater Elementary School District, Long Beach USD and Compton USD are among the 500 school districts around the nation that have partnered with Khan Academy to pilot the tool, according to a March article in EdSource.
McCormick added that his initial thought on ChatGPT and similar products was that the tools will require students to be good editors, to put information into their own voice and to develop the critical-thinking skills “to be able to confirm the information that has been created for them by the chatbot.”
Next year, the district plans to pilot a chatbot that prompts seniors to complete their Free Application for Federal Student Aid.
Anaheim Union HSD Superintendent Michael Matsuda said he’s heard of students utilizing ChatGPT to help with college applications. In the Orange County district, students are being exposed to AI technology via Anaheim Union’s artificial intelligence career pathway.
Anaheim Union HSD Board President Brian O’Neal is proud of the district’s efforts, saying “Artificial intelligence is here to stay. … Therefore, the educational field needs to get in front of this phenomenon and start providing the future world leaders with an understanding of the basics, and hopefully more than that.
“Anaheim Union High School District is on the path of being in the forefront of this educational idea with its AI course at John F. Kennedy High School. The district is not providing a class of AI, but a pathway for its students to be able to finish high school and continue on in this field with a solid understanding of AI, and how it can be used positively in society,” O’Neal continued. “The students will understand that AI can be used as a detriment to society, therefore it is imperative to provide them with an understanding of ethical responsibility. We are now in our second year of offering AI, and our second year of a subject that will continue to evolve.”
Resources
MERC “Balancing the Benefits and Risks of AI Large Language Models in K-12 Public Schools” research brief: https://bit.ly/3o3JkwJ
U.S. Department of Education Office of Educational Technology report Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations: https://bit.ly/43MaBDd
California Department of Education AI literacy webinar recap: https://bit.ly/3WIwaSM
“AUHSD Future Talks” podcast interview with aiEDU’s founder: https://apple.co/3W6QLzR
Merlyn Mind podcast interview with Val Verde USD’s Phil Harding: https://bit.ly/42OdIug
Initially, the district charted a narrower focus for the pathway, centered on skills like programming, coding and robotics along with other AI basics, but the emergence of generative AI has changed things, according to Matsuda.
“We are evolving in terms of our thinking and our approaches to AI. I think the short answer is initially it was a pathway. It still is a pathway at Kennedy High School, but it’s touching everything now,” Matsuda said. “You have to start in the K-12 space, there’s going to be new types of courses on generative AI. We’re developing some of that in terms of how a student can use these tools and create music, create videos, create a new way of looking at essays, for example. It’s really going to amplify the innovation that comes out of K-12. I think you need a very strong ethical piece in there. [We’re] working with a lot of our English teachers and even teachers in International Baccalaureate to bring in an ethical layer to any of these technologies or cutting-edge areas in STEM.”
The district has created a taskforce and partnered with the nonprofit aiEDU to further develop guidelines on AI education for K-12, as resources in the area are lacking. It plans to host a national summit on AI next spring.
Just because the district is all-in on its AI endeavors doesn’t mean it has no concerns. Matsuda fears that AI could amplify bullying, isolation and depression. He doesn’t think that trying to block the use of AI will work, however.
He urged other school leaders to create taskforces consisting of teachers and other stakeholders to get ahead of the curve.
“I think that we need to create awareness opportunities for teachers and educators and admins and school board members,” Matsuda said. “To me, AI is going to undermine a lot of the need for traditional content, multiple-choice testing. The bigger challenge I think we need to be focused on is emotional intelligence, relational intelligence, and how to use generative AI to create and innovate so America can continue to lead the world in innovation.”
Overall, ChatGPT seems to agree with those who are working to find ways to responsibly introduce AI to K-12 learning.
“In conclusion, while AI holds potential benefits for K-12 education, it is important to approach its integration thoughtfully. Schools should carefully consider the specific applications, address ethical concerns, and ensure equitable access to technology,” ChatGPT cautioned. “Ultimately, AI should be seen as a tool to enhance teaching and learning, with teachers playing a central role in guiding and supporting students’ educational journeys.”