Dive Brief:
- Teachers are increasingly using generative artificial intelligence tools to support students with disabilities in ways that save educators time, surface best practices for interventions and improve communication with students and parents, according to a new paper from the Center for Democracy and Technology.
- CDT, however, warns of risks in using AI to craft individualized education programs, including potential violations of the Individuals with Disabilities Education Act and privacy laws, as well as the possible introduction of inaccuracies and biases.
- In addition, teachers should be cautious about entering students' identifiable information into AI tools, especially ones not vetted and approved by their school system, the center said.
Dive Insight:
Nearly 60% of special education teachers reported using AI to develop an IEP or Section 504 plan during the 2024-25 school year. That's an 18-percentage-point increase from the previous school year, according to polling by CDT.
This use of AI is a good idea, according to 64% of parents of students with an IEP or Section 504 plan and 63% of students with either plan, the polling found.
Furthermore, about a third of special educators said they used AI to perform specific tasks during the 2024-25 school year, including:
- Identifying trends in student progress and helping determine patterns for goal setting.
- Summarizing the content of IEP and 504 plans.
- Choosing specific accommodations while creating IEP or 504 plans.
Fewer special educators said they used AI to write only the narrative portion of an IEP or 504 plan (21%), or to fully write either plan (15%).
Saving time was the main benefit for special educators using AI, according to CDT. The center cited research showing that teachers who use AI tools weekly may save up to six weeks over a school year.
That's a significant time savings, considering some school systems are reporting severe special educator shortages and burnout, CDT said.
"Teachers and administrators are interested in tools that save even a small amount of time per IEP, including those that use AI," the paper said.
But that potential comes with certain risks, specifically legal and privacy liabilities, CDT said.
For instance, IDEA requires each IEP to be unique and tailored to the individual student's disabilities, goals and process for achieving those goals. An AI tool that develops IEPs based on little student-specific information, and whose output is not substantially reviewed and edited by a teacher, likely would not meet these IDEA requirements, the CDT paper said.
Educators and school systems should also be aware of privacy rules under the Family Educational Rights and Privacy Act, IDEA and state-level privacy laws when using AI tools, CDT said. Any student information included in a query to a chatbot can be collected and likely stored by the chatbot company, the center said.
The privacy risks and chance of violating FERPA vary depending on factors like the chatbot version being used and whether the school or district has agreements with vendors that license purpose-built tools, which may have more privacy protections.
To support special educators as AI becomes more embedded in the field, CDT recommends that school and district administrators provide guidance and training on how to responsibly and ethically use AI to develop IEPs, including how to adhere to legal requirements. The center also suggests administrators establish communities of practice so teachers can discuss best practices for using AI in IEP development.
Moreover, schools should discuss with parents and students how AI is incorporated into the IEP development process so they can share any concerns and suggestions, the center said.
For the survey, CDT polled 275 licensed special education teachers, 394 students with an IEP or 504 plan, and 336 parents of a child with an IEP or 504 plan. CDT's online surveys of nationally representative samples, conducted between June and August 2025, included 1,030 students in grades 9-12, 806 teachers of grades 6-12 and 1,018 parents of students in grades 6-12.