As artificial intelligence grows in popularity, school districts can find it overwhelming to know where to start, or how to properly implement the technology in classrooms and administrative settings.
While the exact process can vary by district, K-12 technology leaders stress the importance of having an evaluation system in place for vetting AI tools — especially when it comes to protecting student data and mitigating biases.
Here are four pieces of advice that district tech administrators recommend schools keep in mind when considering new AI apps and platforms.
Start small and don’t rush
Washington’s Peninsula School District is easing into AI implementation and currently permits the use of 11 AI tools, ranging from popular general-purpose options such as ChatGPT to more specialized ed tech products, said Kris Hagel, the district’s chief information officer.
Hagel said he wants to be more cautious when adopting AI tools, given that Peninsula School District already uses over 1,000 different ed tech tools. The district is planning to significantly clean up its approval system in the next year, he said.
“Ed tech tools before have kind of proliferated throughout the district,” Hagel said. That’s why he wants to ensure that, moving forward, any approved tools — including AI — will actually improve the district’s instruction or operations. There needs to be a plan, he said, rather than just taking on a new tool because someone “went to a conference” and “saw a shiny thing.”
For Andrew Fenstermaker, instructional technology coordinator at Iowa City Community School District, it’s important that districts “start small” and “go slow” when they begin the vetting process for AI tools. When using that “go slow” approach, districts should be hands-on and seek feedback from those using the AI tools.
“So if it is a product that's more catered towards the elementary classroom space, make sure to get some insights from classroom teachers that are in the elementary space,” Fenstermaker said.
Align options with district strategies
District leaders stress that any AI tools that are implemented should align with broader strategies, including instructional and operational goals.
At Iowa City Community School District, teachers must submit a form when they want to try a new AI tool or another ed tech product for curricular use, Fenstermaker said. Then, a coordinator in the district’s curriculum and instruction department reviews the request to see whether the tool aligns with the district’s instructional goals. Among the points considered in the evaluation is whether the product will provide data to help inform teaching decisions.
If the curriculum coordinator approves the tool, it then goes to the technology department to examine for any data privacy concerns, Fenstermaker said.
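Fenstermaker did not describe the district’s system in technical terms, but the two-stage intake he outlined can be pictured as a simple workflow. The Python sketch below is purely illustrative: the `ToolRequest` fields, status names, and review functions are hypothetical stand-ins, not Iowa City’s actual process.

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    SUBMITTED = "submitted"
    CURRICULUM_APPROVED = "curriculum approved"
    APPROVED = "approved"
    REJECTED = "rejected"

@dataclass
class ToolRequest:
    tool_name: str
    grade_band: str                  # e.g. "elementary"
    instructional_goal: str          # the district goal the tool claims to support
    provides_teaching_data: bool     # will it return data that informs teaching decisions?
    status: Status = Status.SUBMITTED

def curriculum_review(req: ToolRequest, district_goals: set[str]) -> ToolRequest:
    """Stage 1: a curriculum coordinator checks instructional alignment."""
    aligned = req.instructional_goal in district_goals and req.provides_teaching_data
    req.status = Status.CURRICULUM_APPROVED if aligned else Status.REJECTED
    return req

def privacy_review(req: ToolRequest, passes_privacy_check: bool) -> ToolRequest:
    """Stage 2: the technology department examines data privacy concerns."""
    if req.status is Status.CURRICULUM_APPROVED:
        req.status = Status.APPROVED if passes_privacy_check else Status.REJECTED
    return req

# A request only reaches the privacy stage after clearing curriculum review.
req = ToolRequest("ExampleAI Tutor", "elementary", "personalized reading support", True)
req = curriculum_review(req, district_goals={"personalized reading support"})
req = privacy_review(req, passes_privacy_check=True)
print(req.status)  # Status.APPROVED
```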
District leaders shouldn’t create an entirely separate vetting process for AI tools if they already have one for adopting ed tech, particularly for data privacy and instructional alignment, Hagel said. Instead, he suggested, they should lean on existing systems and make minor adjustments to cover AI, “because everything is going to have AI in it, and if it doesn’t already, it will shortly.”
Patrick Gittisriboongul, assistant superintendent of technology and innovation at Lynwood Unified School District in California, suggested districts should also consider outcomes-based contracting for AI tools when possible, tying a portion of what a vendor is paid to whether the tool delivers measurable results.
Ensure student data is protected
Lynwood USD has an 18-point AI fact sheet for ed tech vendors to fill out before the district adopts a new tool, said Gittisriboongul. The fact sheet asks vendors to detail the purpose of the platform, the training data being used, the source of the training data, and the type of AI model the tool relies on.
Identifying the model type is “much more important now, because every foundational model is competing with each other,” Gittisriboongul said. “So being able to navigate between the models is something that, as IT directors and CTOs [chief technology officers], we want to know to make sure that we protect our organization and understand how some of those models have been trained and maintained and benchmarked.”
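The article names only four of the fact sheet’s 18 points, so the minimal sketch below covers just those. The `VendorFactSheet` class, its field names, and the completeness check are hypothetical illustrations of how a district might track vendor disclosures, not Lynwood USD’s actual instrument.

```python
from dataclasses import dataclass, fields

@dataclass
class VendorFactSheet:
    # Four of the 18 disclosure points described in the article;
    # the remaining points are not listed publicly.
    platform_purpose: str        # what the tool is meant to do
    training_data: str           # what data the underlying model was trained on
    training_data_source: str    # where that training data came from
    model_type: str              # which kind of AI model the tool relies on

def missing_disclosures(sheet: VendorFactSheet) -> list[str]:
    """Flag any field a vendor left blank before adoption proceeds."""
    return [f.name for f in fields(sheet) if not getattr(sheet, f.name).strip()]

sheet = VendorFactSheet(
    platform_purpose="AI writing feedback for grades 6-8",
    training_data="",             # left blank by the vendor
    training_data_source="",
    model_type="fine-tuned large language model",
)
print(missing_disclosures(sheet))  # ['training_data', 'training_data_source']
```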
Still, Lynwood USD has had some difficulty getting vendors to fill out the fact sheet, he said.
Transparency is crucial around district AI products, any related data sets, and the privacy terms an AI platform articulates, Fenstermaker said. That transparency not only allows districts to protect themselves from future data breaches, he said, but also ensures a district keeps its own student data secure while using AI tools.
To help with bandwidth, Fenstermaker said, Iowa City Community School District often leans on the global nonprofit 1EdTech to help vet privacy policies for tools that have already been evaluated by the organization. By doing so, the district can easily spot a tool’s possible red flags in its data privacy policy while also streamlining the district’s process to ensure student data is protected, he said.
Watch out for AI biases
When it comes to any technology, but especially AI, Hagel said, “everything is programmed with some sort of biases, whether it’s the bias that already exists in society or it’s the bias of the people that are actually building the tool.”
The key issue for schools, Hagel said, is accounting for those biases and making sure districts get “the results that you want to get out of these tools, knowing that that exists and knowing that there are things you can do to compensate for that.”
One of the most important parts of Lynwood USD’s AI fact sheet is that it seeks and assesses information from a vendor about its products’ algorithmic impact, said Gittisriboongul. That includes questions about how the vendor ensures the AI tool works correctly in school environments, as well as how it fine-tunes its systems and manages for bias.
When asking vendors about potential biases in an AI tool, Gittisriboongul said, Lynwood USD investigates how AI tools aim to treat students fairly and without prejudice. Additionally, the district looks for any third-party studies on AI tools that rate any issues of potential bias and the model’s level of fairness or accuracy. The district also tries to ensure there are ways to report any issues that staff or students may encounter with an AI tool, he said.
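Taken together, the checks Gittisriboongul described amount to a short checklist. As a hypothetical sketch, assuming a district records each answer as a simple pass or fail, it might look like the following; none of these names come from Lynwood USD’s actual review.

```python
from dataclasses import dataclass

@dataclass
class BiasReview:
    validated_in_school_settings: bool   # vendor explains how the tool works correctly in schools
    bias_management_described: bool      # vendor describes fine-tuning and bias management
    third_party_fairness_study: bool     # independent study rates bias, fairness and accuracy
    issue_reporting_channel: bool        # staff and students have a way to report problems

def unmet_checks(review: BiasReview) -> list[str]:
    """Return the checks a tool has not yet satisfied."""
    return [name for name, passed in vars(review).items() if not passed]

review = BiasReview(True, True, False, True)
print(unmet_checks(review))  # ['third_party_fairness_study']
```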
In Iowa City Community School District, administrators look for any third-party certifications an AI tool may have, Fenstermaker said. One example is the nonprofit Digital Promise’s Responsibly Designed AI certification. To earn the certification, a vendor must go through an evaluation process that examines whether its AI tools are inclusive and represent all demographics across populations, he said.