What do superintendents need to know about artificial intelligence?
SAN DIEGO — When it comes to artificial intelligence, the good news for superintendents is that most people have some idea of what it is at this point.
“Who isn’t aware of it?” asked Susan Enfield, who recently resigned as superintendent of the Washoe County School District in Nevada and previously led Highline Public Schools in Washington.
Indeed, when asked if they had experimented with AI in their school districts, nearly all attendees raised their hands during a packed morning session at the National Conference on Education held by AASA, The School Superintendents Association, on Feb. 16.
Despite general familiarity, however, technical knowledge shouldn’t be assumed for district leaders or others in the school community. For instance, it’s critical that any materials related to AI not be written in “techy talk” so they can be clearly understood, said Ann McMullan, project director for the Consortium for School Networking’s EmpowerED Superintendents Initiative.
To that end, CoSN, a nonprofit that promotes technological innovation in K-12, has released an array of AI resources to help superintendents stay ahead of the curve, including a one-page explainer that details definitions and guidelines to keep in mind as schools work with the emerging technology.
Top of mind for many leaders is ensuring that, alongside any awareness of AI in school communities, stakeholders also understand the technology’s limitations. Superintendents must not only tell teachers, parents and others in the community what AI can do for learning, but also clarify its limitations and correct misconceptions about how it can and will be used, said Matthew Friedman, superintendent of Quakertown Community School District in Pennsylvania.
That even includes concerns educators may have that they’ll be replaced wholesale by AI teachers. That won’t happen, said Glenn Robbins, superintendent of Brigantine Public Schools in New Jersey. But the technology’s growing presence in everyday life and its importance in students’ future career paths could mean educators who don’t adapt will be replaced by those who know how to use AI effectively, he said.
“We have a responsibility on how to teach them to use this properly,” said Robbins, adding that kids today will likely have a future teammate in the workplace that is a machine or virtual entity.
What does proper AI use look like?
AI — and specifically generative AI, in the context of the panel’s discussion — has clear limitations. These have been on display in AI-generated images whose subjects have, say, six or more fingers, and in AI-written text that fabricates details when asked to recount real events.
Because of this, district leaders must ensure staff, students, administrators, parents and the community understand these shortcomings, and that generative AI is a tool for gleaning ideas rather than for producing a final product, Friedman said.
Robbins added that it’s also important to know that the free version of ChatGPT typically lags about six months behind the paid version in terms of the information it has been trained on. “Think about that for a second. When you type in ‘speaker of the house,’ who’s going to come up?”
Generative AI also has biases based on the information it has consumed, which can result in the technology producing “complete hallucinations,” said Robbins. AI hallucinations are defined by Google as “incorrect or misleading results that AI models generate.”
As an example, Robbins cited an exercise in which he had teachers use ChatGPT to create a writing prompt for an English language arts class, then paste the prompt back into ChatGPT and ask the tool to answer it.
“Then if you do it again and say, ‘What did we do in class last week while we were working on this,’ it will make up a story about what was going on in class — once again, a complete hallucination that has nothing to do with the class.”
Comparing current generative AI to early versions of search engines and Wikipedia, Robbins stressed that students and educators alike need to do their own background research to verify the information they get.
“Anything created by and used by humans is going to be flawed,” Enfield said. “I think that’s the moral of Glenn’s anecdotes there.”
The limitations of generative AI, she continued, underscore the ongoing importance of critical thinking skills. “To delve deeper and not just accept an initial answer is going to be important,” Enfield said.
The question of ethics and academic honesty
Concerns about ethics and cheating with generative AI aren’t entirely new to education. With AI, Robbins said, the question often comes back to the familiar, “Are you just cutting and pasting?”
“Has anyone had that yet, where they just cut and paste the answer … and they forgot to take the quotations out?” asked Robbins, referencing the quotation marks ChatGPT sometimes wraps around the text it generates.
For educators, this creates more of an impetus to get to know students and understand how to personalize learning. When you know your students, no matter the grade level or subject, you also get to know their work, Friedman said.
Enfield added that as a former high school English teacher, she knew what her kids’ work looked like and could tell if a student actually wrote a paper they submitted. Citing a 2023 Stanford University study, she also noted that students aren’t cheating any more now than they were before the advent of generative AI tools like ChatGPT.
Creating a culture of responsible AI use
When it comes to training educators to use these tools, Enfield said that first and foremost, superintendents must address the “fear factor” adults have — not just of learning something new, but of how students will use these tools.
Robbins echoed that sentiment, saying teachers must also feel safe and supported, with leeway to experiment and fail. “You can roll this out, but if your teachers are petrified of you, they’re not gonna do it,” he said.
To alleviate those concerns, he suggested district leaders showcase how they’ve experimented with the technology on their own time.
Additionally, district leaders should build capacity from within by identifying “champions” of generative AI use — who may be thinking 10 steps ahead of others — among their teachers, administrators, students and community members, said Friedman.
To keep parents and the community informed of what’s going on in schools, he also recommends holding events like monthly parent academies and reiterating that district leaders are available if families are uncomfortable with something or have concerns. “I can’t fix anything I don’t know about,” Friedman said.
Ultimately, he said, superintendents have to look at this from a bird’s-eye view without trying to tackle everything at once, which makes leaning on colleagues and broader networks all the more crucial.
“I know the common phrase in the superintendency is it’s a lonely job. But I can tell you that as you continue to build your network — and look around this room — there’s amazing resources all around you,” said Friedman. “Don’t feel like this is something that you have to go out and tackle on your own.”