Whether it’s with artificial intelligence-powered 1:1 tutoring services for students or a course-building aide for teachers, ed tech companies are increasingly embracing AI — especially since ChatGPT entered the scene in November 2022.
As education leaders and policymakers weigh the pros and cons of using AI, the Software and Information Industry Association on Tuesday released principles to guide companies when developing tools for the classroom.
The SIIA principles for ed tech companies advise:
- AI tools should address the needs of students, families and educators.
- AI tools must consider educational equity, inclusion and civil rights as crucial factors in fostering positive learning environments.
- AI tools must protect student data and privacy.
- AI technologies should be transparent so school communities understand how to use them.
- Ed tech companies should engage with schools and stakeholders to explain the risks and rewards of using these new technologies.
- Companies developing AI tools should adopt best practices for accountability, assurance and ethics.
- The ed tech industry, as a whole, should work with school communities on supporting AI literacy among educators and students.
These principles come as the ed tech industry shows signs of booming once again. Market.Us this month projected that global ed tech spending will skyrocket from $14.8 billion in 2022 to $132.4 billion by 2032.
“With AI being used by many teachers and educational institutions, we determined it was critical to work with the education technology industry to develop a set of principles to guide the future development and deployment of these innovative technologies,” said Chris Mohr, SIIA’s president, in a Tuesday statement.
The SIIA guidance reflects both the promise and the ongoing concerns that AI sparks when integrated into the classroom. School leaders and ed tech experts continue to warn about data privacy issues, biases, plagiarism risks and AI's potential to stunt students’ critical thinking skills.
Punya Mishra, associate dean of scholarship and innovation and a professor at Arizona State University’s Mary Lou Fulton Teachers College, said he views AI as a personal assistant rather than a tool that does someone's work entirely for them.
For instance, if a teacher wants to use generative AI to develop a math game for a lesson plan, they should ask ChatGPT for multiple examples instead of asking it to create just one game, Mishra said.
“Then you pick and choose which ones you like, which ones you don’t, what you’d like to edit,” said Mishra, who also serves on the executive council of the Society for Information Technology in Teacher Education. “Don’t take what it generates at face value.”
Mishra said he is wary of AI’s implicit biases and that guardrails are needed to address and prevent the possibility of AI providing racially biased feedback. That’s also why AI literacy — understanding how the technology works — is crucial, he added. “Part of AI literacy is making teachers aware of these biases,” he said.
Additional AI guidance
SIIA is just the latest organization across the private, industry and government sectors to release AI guidance for schools.
Code.org, the Consortium for School Networking, Digital Promise, the European EdTech Alliance and the Policy Analysis for California Education together released an AI guidance toolkit for schools this month as part of the TeachAI initiative. Begun in 2023, TeachAI aims to bring together education leaders and technology experts to discuss the role of AI-assisted instruction in schools.
Even OpenAI, the company that launched ChatGPT, released a set of recommendations. Those suggestions, put out in August, detail ways for teachers to use generative AI in the classroom, along with examples of prompts for creating lesson plans. The OpenAI guide for teachers also offered warnings and disclaimers mirroring those of technology experts — including that the information ChatGPT provides is not always correct.
A few months earlier, the U.S. Department of Education’s Office of Educational Technology released its first report on AI. Ultimately, schools need to adopt a “humans in the loop” approach, the department said in the May report, meaning AI tools should not replace teachers. Instead, educators should be centered as the decision makers for how AI is used in their classrooms, the agency said.