Published: October 30, 2025
A growing number of special education teachers are leveraging artificial intelligence platforms to draft all or part of students’ Individualized Education Programs (IEPs), even as many districts lack clear policies on the use of this rapidly evolving technology. Educators have long reported struggles with the extensive paperwork associated with IEPs, which can detract from time spent on instruction and collaboration.
Now, AI platforms are seen as a potential solution, helping teachers write these federally mandated, personalized documents more quickly and in greater detail, freeing them to focus more on teaching. The use of AI to create IEPs, however, opens up a host of practical, ethical, and legal questions that often go unanswered for lack of official guidance.
Olivia Coleman, an assistant professor of exceptional student education at the University of Central Florida and a co-principal investigator at the Center for Innovation, Design, and Digital Learning, highlights the challenges faced by special education teachers. “Teachers, and especially special education teachers, are overwhelmed by paperwork, and it’s crowding out time for instruction and collaboration,” she said. “Many report that they just don’t feel prepared to independently write IEPs after completing their teacher-preparation programs. AI is extremely appealing as it can save time and reduce that cognitive load,” she added.
Coleman, who previously worked as a special education teacher, is optimistic that AI could help teachers create stronger IEPs, but only if used responsibly. She emphasizes that AI should serve as a “writing partner” rather than an autopilot for generating these important documents.
One of the biggest concerns is that large language models like ChatGPT are trained on text that doesn’t adequately or accurately reflect the experiences of people with disabilities, creating the risk of bias in their outputs. Additionally, educators could violate student privacy laws if they input sensitive data, like test results, into unsecured platforms. AI apps also sometimes fabricate studies or misrepresent their findings.
According to a recent survey conducted by the Center for Democracy and Technology (CDT), 57% of special education teachers who responded said they used AI to help them with IEPs or plans to accommodate students’ disabilities under Section 504 of the Rehabilitation Act of 1973 during the 2024-25 school year, up from 39% in 2023-24. The survey of 1,018 parents and 806 teachers, including 275 special education teachers, was conducted between June and August.
The survey found that 15% of respondents used AI to write IEPs or 504 plans in full, up from 8% the previous year. Thirty-one percent said they used AI to identify trends in student progress for goal setting, 30% to summarize IEPs and 504 plans, and 28% to help choose accommodations. As in other parts of education, AI use in special education comes as districts struggle to draft policies and create professional development to train teachers in appropriate use of the technology.
Just two states—Ohio and Tennessee—have adopted requirements for districts to create AI policies, according to an Education Week tracker. Thirty-three states have guidance on AI in schools, but it largely focuses on student use rather than teacher use and varies widely. While many states address concerns about student data privacy, Georgia’s guidance is one of the few to mention IEPs specifically, advising educators not to use AI for “high stakes” purposes, like IEPs.
“Streamlining administrative processes at the detriment of the human element can lead to mistrust and challenges associated with AI’s ethical use in the classroom setting,” states the document, issued in January 2025. “For example, a teacher may find using AI to write IEP goals as a benefit to save time, but the parent/guardian of a student with a disability might view the use of AI as disconnected from the individual needs of their child.”
Across disciplines, just 22% of the 806 teachers of grades 6 through 12 who responded to the CDT survey said they had received any training or guidance on the risks of AI, such as inaccuracy or bias in outputs.
There is limited data on how teachers use AI for IEPs. Reports of usage range from individuals using consumer platforms like ChatGPT or Claude on their own to entire districts vetting and purchasing AI platforms to aid in writing the documents. On online message boards, teachers trade tips and prompts. Some educators have created IEP tools on ChatGPT that explain requirements for the document and help draft each component.
On a more formal level, ed-tech companies like MagicSchool AI and Playground IEP have developed specialized platforms to help write IEPs, coordinate meetings, track data, and adapt lessons to fit students’ goals and accommodations. These programs can be vetted and adopted districtwide. “District-approved tools are going to be safer because, when the districts enter into those agreements, they are doing their due diligence to make sure their students’ information is being safeguarded,” Coleman said.
The CDT survey included responses from 336 parents of a child with an IEP or 504 plan, and 64% said it is a “good idea” for teachers to use AI to help with “developing or informing” the creation of the special education plans. This finding may surprise some educators because special education plans involve sensitive, private student data, and parents have long complained that, too often, they amount to check-the-box, boilerplate documents that don’t provide meaningful objectives to help their children grow.
“Those are valid concerns,” Coleman said. “But they’ve been concerns long before AI.” The software many teachers have long used to write IEPs offers drop-down menus of pre-written goals they can drop into the documents with the click of a mouse. Coleman is currently reviewing the quality of 1,100 anonymized IEPs written before the advent of consumer AI platforms and can often tell when a cluster of plans was written by the same teacher because the plans use repetitive language and objectives.
Coleman and Danielle A. Waterfield, a doctoral student at the University of Virginia, published research in February in the Journal of Special Education Technology on how teachers use AI for IEPs and the quality of the resulting documents. Their theory is that AI could help carry some of the cognitive load for stressed teachers, freeing up time and mental capacity to teach more effectively. They found that when experienced special educators compared IEP goals written by teachers to goals written with the help of ChatGPT, there was no statistically significant difference in their ratings.
In a similar 2024 study by researchers at the University of North Carolina, goals written by teachers who’d been advised how to use ChatGPT were reviewed more favorably than those written by a control group of teachers who did not receive training.
Teachers told researchers that creating individualized goals that are measurable, achievable, and time-bound is one of the most challenging parts of writing IEPs. This is why they see promise in AI serving as a tool, not a replacement for their personal judgment. The technology can help them by synthesizing notes in a student’s file, analyzing data, and suggesting ways to measure progress. In follow-up interviews, teachers also indicated a need for more training on how to use AI responsibly and effectively.
Organizations like the Center for Innovation, Design, and Digital Learning offer online “office hours” where educators can speak with professors and discuss practical and ethical questions. Coleman and Waterfield have also developed a framework for the ethical use of AI in IEPs that is currently under review for publication. It includes a decision tree to help educators weigh ethical considerations. Among its recommendations:
- Plans should be reviewed and revised to ensure they are personalized and accurately reflect student data.
- Educators should use a checklist, included in the framework, to check for bias in their drafts.
- Educators should disclose the use of AI to parents, students, and other members of the IEP team, indicate which content is AI-generated, and document edits to show how they arrived at the final document.
- Educators should document and track the effectiveness of the prompts they use to create learning goals.
“We don’t want the loss of individualization,” Coleman said. “AI can help you produce a draft, but it should not be your final draft.”
Special education teacher workload concerns drive rising interest in AI tools

Special education teachers’ concerns about heavy workloads and inadequate support contribute to high turnover, said Elizabeth Bettini, an associate professor of special education at Boston University who studies special education teachers’ working conditions and morale. “It’s not surprising” that those teachers are turning to AI, Bettini said. “Special education teachers are overwhelmed. They have pretty extensive paperwork obligations, and there’s no time in the day set aside for that.”

Bettini has found that special education teachers work about 10 hours a week outside of the school day, largely on paperwork and case management. And, though IEPs are a significant part of their work, many schools don’t set aside any time to create them.
Q: What are Individualized Education Programs (IEPs)?
A: Individualized Education Programs (IEPs) are federally mandated, personalized documents that detail the educational goals and support services for students with disabilities. They are designed to ensure that each student receives tailored instruction and resources to meet their unique needs.
Q: How is AI being used to write IEPs?
A: AI platforms, such as ChatGPT, are being used by special education teachers to draft IEPs more quickly and with greater detail. These platforms can help synthesize student data, suggest goals, and provide templates to streamline the process.
Q: What are the main concerns with using AI to write IEPs?
A: The main concerns include potential bias in AI-generated content, violations of student privacy if sensitive data is input into unsecured platforms, and the ethical use of AI to ensure that IEPs remain personalized and effective.
Q: How can teachers use AI responsibly for IEPs?
A: Teachers can use AI responsibly by reviewing and revising AI-generated content to ensure personalization, using checklists to check for bias, disclosing the use of AI to parents and the IEP team, and documenting the effectiveness of AI prompts.
Q: What is the current state of AI policies in schools?
A: As of now, only two states—Ohio and Tennessee—have adopted requirements for districts to create AI policies. Thirty-three states have some form of guidance on AI in schools, but this guidance varies widely and often focuses on student use rather than teacher use.