Schools Are Taking Too Long to Craft AI Policy. Why That’s a Problem

By Alyson Klein — February 19, 2024

It’s been more than a year since ChatGPT’s ability to produce astonishingly humanlike writing sparked fundamental questions about the role of artificial intelligence in K-12 education.

Yet most school districts are still stuck in neutral, trying to figure out the way forward on issues such as plagiarism, data privacy, and ethical use of AI by students and educators.

More than three-quarters—79 percent—of educators say their districts still do not have clear policies on the use of artificial intelligence tools, according to an EdWeek Research Center survey of 924 educators conducted in November and December.

District leaders want to help schools chart the right course on the potentially game-changing technology, but many feel “overwhelmed and overloaded,” said Bree Dusseault, the principal and managing director of the Center for Reinventing Public Education, a research organization at Arizona State University’s Mary Lou Fulton Teachers College, who has studied AI policymaking.

The lack of clear direction is especially problematic given that the majority of educators surveyed—56 percent—expect the use of AI tools to increase in their districts over the next year, according to the EdWeek Research Center survey.

And while experts are encouraging schools to teach their students to use AI appropriately, banning the tools for students is still a relatively common practice in K-12 education, the survey found.

One in 5 educators surveyed said that their district prohibits students from using generative AI, such as ChatGPT, although teachers are permitted to use it. Another 7 percent of educators said the tools were banned for everyone—including staff.

When district officials—and school principals—sidestep big questions about the proper use of AI, they are inviting confusion and inequity, said Pat Yongpradit, the chief academic officer for Code.org and leader of TeachAI, an initiative aimed at helping K-12 schools use AI technology effectively.

“You can have, in the same school, a teacher allowing their 10th grade English class to use ChatGPT freely and getting into AI ethics issues and really preparing their students for a future in which AI will be part of any industry,” Yongpradit said. “And then literally, right down the hall, you can have another teacher banning it totally, going back to pencil and paper writing because they don’t trust their kids to not use ChatGPT. Same school, different 10th grade English class.”

The new “digital divide will be an AI divide,” Yongpradit said.

‘Policy is always behind technology’

It’s not hard to understand why most district leaders aren’t eager to make big decisions about how their schools will use the technology.

Many educators worry that if students are exposed to generative AI, they’ll employ it to cheat on assignments. Plus, AI tools can spit out false information and magnify racial and socioeconomic biases. AI also develops—some would say “gets smarter”—by consuming data, opening the doors for potential student-data-privacy nightmares.

The vast majority of educators don’t have the capacity to cope with those complications on top of their other responsibilities, the survey found.

More than three-quarters—78 percent—of educators surveyed said they don’t have the time or bandwidth to teach students how to think about or use AI because they are tied up with academic challenges, social-emotional learning, safety considerations, and other higher priorities.

What’s more, AI is changing so rapidly that any policy a district or state crafts could be outdated the moment it is released.

That’s typical when it comes to new technologies, said Kristina Ishmael, who until late last year served as the deputy director of the U.S. Department of Education’s office of educational technology.

“Policy is always behind technology,” said Ishmael, who is now a strategic advisor at Ishmael Consulting. In some cases, that’s “very intentional, because it’s policy; once you put it in, it’s hard to take it off.”

But AI requires a shift in thinking, she pointed out.

AI policy and guidance need to be “living, breathing documents, because the technology is changing so quickly,” Ishmael said. “It’s not something like a continuous improvement plan where your school is looked at every couple of years, then the binder sits on the shelf.”

Another stumbling block, she said: Some district leaders are hitting the pause button to see if their state or Washington policymakers establish AI parameters. The federal Education Department has pledged to release resources, including an AI policy tool kit this year. Members of Congress have introduced legislation on select AI issues, such as literacy.

But it’s not clear if more significant action is on the horizon, Ishmael said.

“Folks are waiting to see what happens at the federal level,” said Ishmael. But she recommends districts avoid delay.

“I’d tell them to start working on things now,” she said. “This is a brand-new tool that is impacting our classrooms and our lives. There needs to be some sort of baseline parameters for students to be able to use [it].”

‘We’re all entering this innovative environment with a lot of unknowns’

Most educators see value in understanding AI.

Two-thirds of those surveyed by the EdWeek Research Center say students will need knowledge of AI because the technology already features so heavily in the products and services that are part of their daily lives. And 60 percent say that employers are looking for people who can work with AI tools to do their jobs more efficiently.

Nearly half said students will need AI skills to be successful in college, and nearly a third believe younger students will need them to do well academically in the upper grades.

That’s motivated some district leaders to move quickly.

“I think the driver for me is really looking at the jobs of the future and looking at it through the economic lens,” said Jerry Almendarez, the superintendent of the Santa Ana Unified school district, which he describes as a largely “blue collar” Southern California community. “I see this as a window of opportunity for communities like mine, to catch up to the rest of society by giving [students] skills and access to a technology that has never been at their fingertips before,” he said.

District and school leaders who want to help their students navigate this technology “should know that they’re not alone, if they don’t know where to start,” Almendarez said. “That’s OK. None of us really do. We’re all entering this innovative environment with a lot of unknowns.”

Almendarez suggested districts turn to entities that have already sketched out what AI policy guidance could look like. That includes six states—California, North Carolina, Oregon, Virginia, Washington, and West Virginia—as well as districts that were early out of the gate on AI policy, such as Washington state’s Peninsula school district near Seattle.

Nonprofit organizations have also stepped up. TeachAI last fall released a guidance tool kit that offers suggestions for mitigating privacy risks of AI tools, tactics for ensuring students use the technology to inform their assignments and not to cheat, and tips on how to train staff on using AI appropriately.

Last fall, the Council of the Great City Schools and the Consortium for School Networking released a 93-question checklist to help educators think through policies around generative AI. The list includes queries such as: Does your district have a dedicated point person on the role of artificial intelligence in K-12 education? Are you requiring vendors that use AI algorithms in their products to ensure they are free from bias?

That kind of direction is what district leaders are searching for, said Dusseault of the Center for Reinventing Public Education.

“We’ve heard superintendents say, ‘I would like to see support, and it doesn’t have to come from my state. It could come from a trusted nonprofit,’” she said.

Some districts are taking it a step further. New York City, which reversed an initial ban on ChatGPT, and Santa Ana are launching AI policy shops whose work can inform the broader field.

‘What really is the purpose of having kids take world literature or biology and physics?’

Much of the early discussion around the use of generative AI in K-12 classrooms centered on how students might use the technology to cheat, Dusseault said.

“One of the big questions that I know out the gate was kind of scary and put some of the districts on their back foot was plagiarism, this idea that ChatGPT is going to end up giving students the ability to plagiarize and not represent their work,” Dusseault said.

But district and state leaders’ thinking has evolved over the past year, she said.

“Now, a year later, we’re seeing: ‘We are probably going to all be using some large language model or something like ChatGPT into the future, so students may need to actually have skill building on how to use it appropriately.’”

One state on the vanguard of this approach: North Carolina, whose AI guidance, released last month, includes a clear outline of possibilities for using AI on assignments without the technology encouraging cheating or plagiarism.

As generative AI gets ever more adept at the kinds of assignments teachers regularly give students—write an essay on bird imagery in Shakespeare’s “Macbeth,” explain the differences between igneous and metamorphic rocks—educators will need to rethink long-held tenets of teaching and learning, said Catherine Truitt, North Carolina’s superintendent of public instruction.

They will have to ask themselves: “What really is the purpose of having kids take world literature or biology and physics? What should kids be getting out of these courses?” she said.

Educators “are going to have to start having hard conversations” about what it really means to teach content or help students develop critical-thinking and analytical skills, she said. “The consequences of us ignoring it and sticking our heads in the sand is that students will game the system.”

Data analysis for this article was provided by the EdWeek Research Center. Learn more about the center’s work.

Coverage of education technology is supported in part by a grant from the Siegel Family Endowment, at www.siegelendowment.org. Education Week retains sole editorial control over the content of this coverage.
