construction paper artwork of a robot with a speech bubble, taking place on a laptop screen
AI Has
Arrived

Artificial intelligence technology is already infiltrating classrooms

By Heather Kemp

“The question of whether K-12 schools should embrace AI is a topic of ongoing debate among educators, policymakers and experts in the field. While there are potential benefits to integrating AI into educational settings, there are also important considerations to take into account.”

That is part of ChatGPT’s response to the question “Should K-12 schools embrace AI?”

Warnings and testimonials about artificial intelligence (AI) have taken over newsfeeds since the tool, an AI large language model (LLM) created by OpenAI, launched its free preview for users to interact with and submit feedback in November 2022. ChatGPT is meant to hold human-like conversations and can complete tasks such as writing emails, essays and code in addition to responding to questions and other functions.

Its introduction has caused concern among some in education who believe it will assist students in cheating on assignments and further their dependency on technology, in addition to other issues.

In December 2022, a Los Angeles Unified School District spokesperson told The Washington Post that the district had blocked access to OpenAI and ChatGPT on its networks and devices in an attempt to protect academic honesty as risks and benefits were assessed. A number of districts throughout the country took similar actions.

Others, however, like Anaheim Union High School District and Val Verde USD, embraced the use of AI by both students and staff long before ChatGPT went live.

Anaheim Union HSD offers an artificial intelligence career pathway for local learners while Val Verde USD is utilizing digital assistants to support teaching and classroom management.

Benefits of AI that ChatGPT generated when asked include its ability to provide students with personalized learning; efficiency in performing administrative tasks; ease of access to tools that enhance teaching; and its capability to analyze large data sets to find trends in student performance, aiding teachers in making informed decisions, identifying students who are struggling and providing early intervention.

Artificial intelligence is defined as “the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision making, and translation between languages,” according to Oxford Languages.

ChatGPT also highlighted ethical considerations such as issues with data privacy, security and algorithmic bias and equity — mentioning that unequal access to technology already exists. “Ensuring that AI is accessible to all students is crucial to avoid exacerbating the digital divide,” the chatbot said.

Another factor it weighed is the teacher-student relationship. “The role of teachers is essential in fostering student engagement, critical thinking, and social-emotional development. While AI can provide support, it should not replace the human connection between teachers and students. Balancing the use of AI with human interaction is crucial,” according to ChatGPT.

Many of the topics that ChatGPT listed in its reply are explored in the research brief “Balancing the Benefits and Risks of AI Large Language Models in K-12 Public Schools,” published by Virginia Commonwealth University’s Metropolitan Educational Research Consortium (MERC) in April.

Jesse Senechal, MERC’s director, noted that while research on AI and education has been conducted historically, continued research in the area is needed.

Recent research

Senechal predicts the use of chatbots in schools will grow exponentially in the future and encourages educators to familiarize themselves with various generative AI models.

“There are also models that do images and sound, but they’re trained on large bodies of data; with ChatGPT, it’s text,” Senechal said. “What they’re doing is pattern recognition, where they’re trained on the data, and then they’re able to respond to prompts and predict likely answers to that based on the patterns they’ve seen in the data.”

Senechal understands why districts banned the new technology from their systems but does not see that as a strong long-term strategy.

“I don’t think it’s going to be something that we can avoid. I think we need to take it head-on,” Senechal said. “I made a presentation about this brief to some local superintendents … saying, ‘We need to develop a strategy around this and a policy around this right now. Starting next fall, we need to have a plan in place for this — otherwise we’re going to be reactive and we’re going to fall behind it quickly.’”

Whether or not school or district leaders are aware, AI is likely being used on campuses, Senechal added, “and so I think that initial questions of, ‘who’s doing it, how are they doing it and who’s doing it well?’ are important questions to ask at this point.

“At the same time, it’s brand new,” Senechal continued. “It’s hard to imagine where we’re going to be in a year or two years with this, but I think it’s better that we start to ask those questions now.”

line drawing of a robot
The brief defines AI and LLMs and explains how they work. It also examines the implications of AI LLMs on teaching and learning, as well as concerns around their use in public schools. For example, the tools could alleviate educator workload by automating various tasks and aiding in creating lesson plans, designing assignments, grading papers and more.

Senechal suggested AI tools could potentially create space for more relationship- and community-building between teachers and students by automating a portion of teachers’ administrative work.

Using LLMs for lesson planning can ultimately support inclusivity. LLMs may be used to personalize content such as writing prompts or classroom activities to match individual student needs and learning styles and have the potential to enhance accessibility for students with disabilities as well as craft adapted lessons for English learners. For delivery of instruction and assessments, LLMs can provide personalized tutoring and even identify potential biases in assessment practices, allowing for more individualized and equitable student learning experiences.

Research, writing and studying assistance, personalized learning, language learning and self-advocacy are some ways students might use LLMs, according to the brief.

Concerns around the use of products like chatbots include weakened academic integrity, student privacy, the possibility of being fed inappropriate content, inherent biases from the data used to train the model, equity considerations, and teacher and student overreliance.

In June, CSBA announced a pioneering initiative, the AI Taskforce: Education in the age of artificial intelligence, aimed at equipping boards of education and staff with the necessary knowledge and tools to navigate the complexities of AI integration in public schools.

The taskforce will comprise a multidisciplinary team of technology experts, local educational agency members, educators, school administrators, researchers, academics, and other school and community partners. This diverse group will collaborate to provide guidance, promising practices, policies and legislative recommendations that address the challenges and opportunities posed by AI in our schools and society. By harnessing the potential of AI technology in education, CSBA aims to enhance learning outcomes and prepare students for the demands of the future workforce, all while addressing ethical concerns and guarding against the potential for abuse of AI in public schools.

The taskforce may consider a multitude of issues related to AI and schools, including, but not limited to, how to positively integrate AI technology into the instructional setting; assessing the ethical, legal and privacy implications of AI usage in schools; identifying appropriate policies for the use of AI; as well as examining the potential risks of using it in schools.

“The AI Taskforce: Education in the age of artificial intelligence embodies our commitment to provide CSBA members with the resources for a high-quality education that harnesses the power of innovation while ensuring the well-being of students, supporting content mastery, and protecting the integrity of the teaching and learning process,” said CSBA CEO & Executive Director Vernon M. Billy.

“While LLMs can certainly be used as a supplemental teaching tool, claims have been made that these technologies will create student dependency on technology, which will affect skill development, lead to an erosion of critical-thinking skills and a diminished ability to analyze information,” the brief states. “While LLMs can help generate content and answer questions, they may not always provide accurate or contextually appropriate information, which can be detrimental to the learning process.”

To that end, Senechal, who is a former teacher, remarked, “we need to quickly rethink our media literacy and digital citizenship curriculum and help students understand how to navigate a world where AI technologies are being infused throughout.”

On student privacy, Senechal and his research partners acknowledged that teachers could unintentionally expose sensitive student information and advised potential users to “ensure that LLMs comply with relevant laws and regulations, such as Family Educational Rights and Privacy Act (FERPA) in the United States.”

LEAs should consider reviewing their current policies on topics like student privacy, academic integrity and use of technology to see how they can be revised to address these new technologies.

In the coming months, CSBA anticipates releasing policy updates around homework/makeup work and academic honesty as it relates to AI.

“I think there needs to be the development of new policies and clear policy guidance around this,” Senechal said. “And I’m not exactly sure what that looks like at this point, but I think those conversations need to happen and [policies and guidance] developed quickly. I think the technology’s going to change quickly and get more powerful quickly. And so [education leaders] need to revisit this on a regular basis.”

The MERC brief also provides policy considerations for local educational agencies and recommendations and guiding questions for educators and education leaders.

Recommendations include staying informed on advancements; adopting explicit policies regarding AI usage at the district-, school- and/or classroom-level; embracing chances to scaffold teachers’ work; providing professional development on LLMs to teachers; purposefully disconnecting when appropriate; and teaching students to appropriately use LLMs.

Microsoft and Google have integrated, or are planning to integrate, AI into commonly used products like Word, Outlook and Docs.

“It’s an important time to have a conversation around how technology can support our aims as an educational system. How can it really enhance the work we do?” Senechal said.

“While AI can provide support, it should not replace the human connection between teachers and students. Balancing the use of AI with human interaction is crucial.”

—ChatGPT

line drawing of a mechanical arm
AI in action
Riverside County’s Val Verde USD has been using virtual assistant devices since roughly 2019 and plans to expand availability to all 950 classrooms in the near future, according to Superintendent Michael McCormick.

“The incredible thing is that this technology, Merlyn Mind, is untethering our teachers from their desks and allowing them to be mobile in the classroom with their students,” McCormick said.

Teachers have a voice-enabled handheld remote control and can direct the device, referred to as “Merlyn,” to play PowerPoint presentations and YouTube videos, add content to Google Classroom folders and more.

The district was the first in the U.S. to have the devices, according to McCormick.

Having virtual assistants in classrooms was an idea that the district’s technology integration specialist, Phil Harding, toyed with for a few years before finding the right fit. The district started with 10 devices that were programmed with fewer than 20 commands and could respond to general questions. Incorporating input from teachers, the devices now have more than 160 commands in addition to responding to general questions about topics like the height of landmarks or comparing their heights (as opposed to more open-ended questions like ChatGPT can answer).

Unlike consumer devices such as Amazon’s Alexa, Merlyn was built for use in a school environment. It doesn’t retain personal information or data like voice recordings, Harding clarified.

Although it was designed to assist educators, Val Verde USD teachers have gotten students involved in interacting with Merlyn, with activities like competing against the device to solve math problems. This is something that students are comfortable with as they’ve grown up with AI around them and in their homes, Harding noted.

“Let’s face it, we have the future sitting in our classrooms right now and I feel a strong obligation to create an environment where our students can safely investigate and learn the power of this technology, and learn it in an ethical, responsible way,” McCormick said of their approach. “I almost would equate it to digital citizenship. As a school district, we issue all of our kids Chromebooks. To me, part of the responsibility of issuing and making technology available to our students is digital citizenship, responsible use and learning the basics about student privacy and not sharing things, not assuming somebody else’s identity and stuff like that.”

McCormick noted that he’s heard of teachers beginning to experiment more with AI tools like Curipod, an AI-driven site that creates lesson plans and PowerPoint presentations with interactive pieces built in.

“I don’t think it’s going to be something that we can avoid. I think we need to take it head-on.”

—Jesse Senechal,
director, MERC

line drawing of a head with technology for a brain
Khan Academy, a nonprofit creating online learning to make education free and accessible for all, is developing Khanmigo, an “AI-powered guide. Tutor for learners. Assistant for teachers,” according to its website.

Khanmigo is rolling out slowly, and Atwater Elementary School District, Long Beach USD and Compton USD are among the 500 school districts around the nation that have partnered with Khan Academy to pilot the tool, according to a March article in EdSource.

McCormick added that his initial thoughts on ChatGPT and similar products were that the tools will require students to be good editors, able to put information into their own voice, and to develop the critical-thinking skills “to be able to confirm the information that has been created for them by the chatbot.”

Next year, the district plans to test using a chatbot to prompt seniors to complete their Free Application for Federal Student Aid.

Anaheim Union HSD Superintendent Michael Matsuda said he’s heard of students utilizing ChatGPT to help with college applications. In the Orange County district, students are being exposed to AI technology via Anaheim Union’s artificial intelligence career pathway.

Anaheim Union HSD Board President Brian O’Neal is proud of the district’s efforts, saying “Artificial intelligence is here to stay. … Therefore, the educational field needs to get in front of this phenomenon and start providing the future world leaders with an understanding of the basics, and hopefully more than that.

“Anaheim Union High School District is on the path of being in the forefront of this educational idea with its AI course at John F. Kennedy High School. The district is not providing a class of AI, but a pathway for its students to be able to finish high school and continue on in this field with a solid understanding of AI, and how it can be used positively in society,” O’Neal continued. “The students will understand that AI can be used as a detriment to society, therefore it is imperative to provide them with an understanding of ethical responsibility. We are now in our second year of offering AI, and our second year of a subject that will continue to evolve.”

Resources

MERC “Balancing the Benefits and Risks of AI Large Language Models in K-12 Public Schools” research brief: https://bit.ly/3o3JkwJ

U.S. Department of Education Office of Educational Technology report Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations: https://bit.ly/43MaBDd

California Department of Education AI literacy webinar recap: https://bit.ly/3WIwaSM

“AUHSD Future Talks” podcast interview with aiEDU’s founder: https://apple.co/3W6QLzR

Merlyn Mind podcast interview with Val Verde USD’s Phil Harding: https://bit.ly/42OdIug

Matsuda explained that career pathways (the district also has pathways for cybersecurity and biotechnology, among other fields) involve collaboration with institutions of higher education and corporate, business and nonprofit partners. Those relationships help district leaders identify the kinds of skills employers want and give students dual enrollment opportunities that align with their interests. Hundreds of students participate in the pathway, which starts in seventh grade.

Initially, the district was charting a narrower focus for the pathway, centered on skills like programming, coding and robotics along with other basic knowledge about AI, but the emergence of generative AI has changed things, according to Matsuda.

“We are evolving in terms of our thinking and our approaches to AI. I think the short answer is initially it was a pathway. It still is a pathway at Kennedy High School, but it’s touching everything now,” Matsuda said. “You have to start in the K-12 space, there’s going to be new types of courses on generative AI. We’re developing some of that in terms of how a student can use these tools and create music, create videos, create a new way of looking at essays, for example. It’s really going to amplify the innovation that comes out of K-12. I think you need a very strong ethical piece in there. [We’re] working with a lot of our English teachers and even teachers in International Baccalaureate to bring in an ethical layer to any of these technologies or cutting-edge areas in STEM.”

The district has created a taskforce and partnered with the nonprofit aiEDU to further develop guidelines on AI education for K-12 as resources are lacking. They plan on hosting a national summit on AI next spring.

Just because the district is all-in on its AI endeavors doesn’t mean it doesn’t have concerns. Matsuda fears that AI could amplify bullying, isolation and depression. He doesn’t think that trying to block use of AI will work, however.

He urged other school leaders to create taskforces consisting of teachers and other stakeholders to get ahead of the curve.

“I think that we need to create awareness opportunities for teachers and educators and admins and school board members,” Matsuda said. “To me, AI is going to undermine a lot of the need for traditional content, multiple-choice testing. The bigger challenge I think we need to be focused on is emotional intelligence, relational intelligence, and how to use generative AI to create and innovate so America can continue to lead the world in innovation.”

Overall, ChatGPT seems to agree with those who are working to find ways to responsibly introduce AI to K-12 learning.

“In conclusion, while AI holds potential benefits for K-12 education, it is important to approach its integration thoughtfully. Schools should carefully consider the specific applications, address ethical concerns, and ensure equitable access to technology,” ChatGPT cautioned. “Ultimately, AI should be seen as a tool to enhance teaching and learning, with teachers playing a central role in guiding and supporting students’ educational journeys.”

Heather Kemp is a staff writer for California Schools.