This year, district ed tech leaders will need to use a sharper and more critical lens when vetting new apps and services, especially those powered by artificial intelligence, according to experts in the field.
Meanwhile, the ed tech companies schools work with are likely to face more aggressive accountability measures at both state and federal levels as lawmakers focus on enacting stricter student data privacy measures, say education technology experts.
In 2025, the U.S. Department of Education closed its Office of Educational Technology and later ramped up support for AI tools in K-12 classrooms — but whether schools will be able to implement AI tools successfully and at scale remains to be seen. A Digital Promise analysis of 32 states’ AI education policies, released in December, found that only a handful of states and localities are exploring small-scale pilots measuring the impact of AI tools on student outcomes.
While AI tools continue to rapidly develop and lawmakers explore ways to hold companies accountable for student data security, here are three key trends K-12 technology experts expect to play out in 2026.
Schools will need to be more critical of AI tools amid tightening budgets
As school districts face more budget challenges amid ongoing enrollment declines, it’s likely schools will have to make tough decisions as they purchase AI tools, said Keith Krueger, CEO of the Consortium for School Networking.
Krueger said he also expects schools will soon have to grapple with paying for AI tools that were once free to them.
In 2025, Denver Public Schools carved out local funding to pay for MagicSchool AI, a platform with tools for lesson planning, differentiating instruction and communicating with families. The district made the investment based on teacher requests for an ethical and safe AI system to use in their classrooms, said Luke Mund, the district’s ed tech manager.
Now that AI is increasingly a part of education and more specific use cases are emerging, Mund said, he expects there will be more funding for AI tools in the near future, “because it’s what our teachers want.” Denver Public Schools won a Gates Foundation grant in 2024 for AI in math instruction and is actively applying for grants to fund additional AI initiatives, he said.
As schools look for ways to fund their AI initiatives — whether through their own budgets or outside resources — Krueger and Mund both said it will be crucial in 2026 for district ed tech leaders to scrutinize the large language models that AI ed tech tools rely on, especially when it comes to protecting sensitive student data.
According to Mund, there’s a lack of transparency among some ed tech companies selling AI tools as to how their models are trained, where that data comes from and how it is stored.
“These AI companies are so good at scraping data and saving and retaining for future models, and we just cannot have that with our student information,” Mund said. Student data privacy is paramount to “everything that we do. We cannot have a 3rd grader’s writing end up in an LLM in the future — or their personal and private information.”
More accountability for ed tech providers
School technology leaders aren’t the only ones worried about protecting student information in the digital world. Ongoing state investigations, along with federal settlements over high-profile data breaches at ed tech providers such as Illuminate Education and PowerSchool, could be signs of more enforcement to come in 2026, technology experts say.
“At a minimum, if you have data of users that are under the age of 13, I think you should proceed very carefully,” said Tyler Bridegan, a former legal advisor at the Federal Communications Commission and now a partner at international law firm Womble Bond Dickinson. “There’s a huge appetite at every level of government to do more in this space.”
In April, companies must be in full compliance with the updates the Federal Trade Commission made last year to the Children’s Online Privacy Protection Rule.
Schools should expect to find more details in their ed tech contracts, as the new COPPA Rule requires vendors to provide direct notice to schools as to how they plan to collect and use children’s data once they receive the district’s consent to do so. Companies will no longer be allowed to retain children’s data indefinitely and must set limits on how long they keep it.
In addition, 18 bills that would implement a variety of new federal online protections for children and teens advanced to the full House Energy and Commerce Committee in December.
If those bills gain momentum in Congress, and other state-level proposals move forward as well, Mund said it will be increasingly important for school districts to arrange working groups that track legal developments in student data privacy and AI regulations.
Schools will remain vulnerable to cyberattacks as federal resources dwindle
In 2025, the Trump administration eliminated critical federal resources that were used to support school districts’ cybersecurity measures, Krueger said.
That included the discontinuation of K-12 cybersecurity programs offered through the Multi-State Information Sharing and Analysis Center, which had provided free services to help schools monitor and block malicious threats to their networks.
Another hit came with last year's closure of the federal Office of Educational Technology, as well as the shuttering of K-12 cybersecurity work groups at the departments of Education and Homeland Security, Krueger said.
“So unfortunately, more and more school districts and states are on their own to figure this out,” Krueger said.
As a result, he added, schools will become more vulnerable to cyberattacks in 2026. Krueger said there are fears that cybercriminals will use AI to target schools, but he also said under-resourced schools may be able to tap into AI to better identify their own network vulnerabilities.
Meanwhile, a final rule is expected in May from the Cybersecurity and Infrastructure Security Agency to enforce the Cyber Incident Reporting for Critical Infrastructure Act of 2022.
Under this rule, which was proposed during the Biden administration, school districts with 1,000 or more students — along with all state education agencies — would be required to report a disruptive cyber incident to CISA within 72 hours of its occurrence, or within 24 hours of paying a ransom to cybercriminals. There are currently no federal requirements for schools to report such incidents, though a small but growing number of states require it.
When CISA first proposed the rule in 2024, ed tech leaders stressed the importance of preparing their districts’ administrators for a federal cyber incident reporting requirement.
But regardless of what the final rule requires, Krueger said, the K-12 sector remains highly vulnerable, with few protections and plenty of the sensitive, valuable data that cybercriminals are drawn to.