For generations, school districts have adapted to new technology in the classroom – from overhead projectors to iPads. Now, artificial intelligence (AI) is transforming education more profoundly than anything before it. But how do districts move past buzzwords to real, meaningful change?
Vera Cubero has made it her mission to answer that question. As the emerging technologies consultant for the North Carolina Department of Public Instruction, she guides her state’s educators and leaders through the practical and often inspiring process of bringing AI into education. North Carolina was the fourth state to publish AI guidelines for schools; 26 states have now done so.
She sees a lot of good news on the horizon, as well as a lot of challenges.
That’s because before she became a state leader in AI implementation and a respected voice on the national level, she spent 14 years in a classroom teaching middle school English language arts and social studies, followed by leading technology innovation at the school and district levels. So, her AI literacy training is homegrown, if you will.
Cubero has deep and wide experience leveraging AI to customize learning or identify the need for early intervention. And these AI skills are valuable when it comes to addressing school business challenges like streamlining processes, addressing declining enrollment, enhancing school safety and maintaining facilities.
While AI holds enormous potential, Cubero stresses the need for keeping humanity at the center of education. “While AI can write essays and answer quiz questions in seconds, it cannot replicate the messy, beautiful process of students wrestling with complex problems, collaborating with peers and creating solutions that matter to their communities,” she notes.
No wonder influential people such as Pat Yongpradit, chief academic officer of Code.org; Sal Khan, CEO of Khan Academy; and Amanda Bickerstaff, CEO of AI for Education, have recognized North Carolina’s AI guidelines as some of the best in the nation.
Responsible AI use is a hallmark of Cubero’s work, and the EVERY framework she created with Bickerstaff of AI for Education to guide school administrators, educators and students is based on common sense:
EVALUATE the initial output to see if it meets the intended purpose and your needs.
VERIFY facts, figures, quotes and data using reliable sources to ensure there are no hallucinations or bias.
ENGAGE in every conversation with the GenAI chatbot, providing critical feedback and oversight to improve the AI’s output.
REVISE the results to reflect your unique needs, style and/or tone. AI output is a great starting point but shouldn’t be a final product.
YOU are ultimately responsible for everything you create with AI. Always be transparent about whether and how you used AI.
“Remember that, as an education leader, your procurement dollars are your power,” she said in a panel discussion at a recent education conference – and that involves engaging with vendors and requesting revisions.
She holds herself to the same standards, emphasizing vetted, trusted and K-12-centered resources for school leaders and training in her state’s AI guidelines. The focus, she says, is to cut through the noise of lesser quality material available online, while protecting students’ and educators’ privacy. She also encourages education leaders to focus on building AI literacy and fluency before investing in tools.
We sat down with Cubero to discuss how AI is poised to improve education, how to make sure your district effectively adopts its use and key challenges to steer through.
What’s one thing you thought you knew that you found out you were wrong about?
In 2018, I completed ISTE U’s Artificial Intelligence Explorations course, which was my first introduction to AI, and I was excited about AI’s potential. I assumed AI would have a significant impact on my future grandchildren’s education, but I also thought we had decades to prepare for it. I was wrong about that!
A few short years later, in November 2022, ChatGPT publicly launched, and its rapid adoption shocked me and proved me wrong. ChatGPT reached one million users in five days, crossed 100 million monthly users by January 2023, hit 400 million weekly users by February 2025, and doubled to 800 million weekly users in April of 2025 after they released their 4o model.
Analysts now predict weekly usage will reach one billion by the end of 2025. In addition to its rapid adoption, thousands of other generative AI tools have flooded the market, and the capabilities have accelerated exponentially over the past three years, with no slowdown in sight. Every day, a new development gives me hope in AI’s potential to solve some of humanity’s most pressing issues, such as finding a cure for cancer or saving lives by addressing climate change.
At the same time, new developments like deepfake videos or AI-enabled scams scare me and make AI literacy a pressing civic issue that education leaders cannot afford to ignore. This technology is unlike any that came before and impacts not only education, but every facet of our lives.
If you could give one piece of advice to someone starting out in the education field, what would it be?
My number one piece of advice is this: Break free from the past to build the future.
As an educator, my advice to anyone stepping into the education world, including school business officials, is this: Don’t replicate the systems you experienced as a student. Those systems were built for a different era, one where information was scarce, jobs were predictable and success came from following instructions. That world no longer exists.
Today’s students are entering a world transformed by technology, especially AI. They carry powerful tools in their pockets that earlier generations couldn’t imagine. Preparing them for this new reality means rethinking not just teaching methods, but also the structures, policies and investments that support learning.
Visionary school business officials are vital in reshaping our outdated education systems so they prepare students for that future. The future belongs to adaptable learners who think critically, collaborate with both humans and machines, and understand the potential and limitations of emerging technologies. Transformation must extend beyond the classroom to the physical and procedural environments of schools: how we design learning spaces, allocate resources and support innovation in a quickly evolving world.
While my work has focused on supporting educators and education leaders, I’ve come to deeply appreciate the role school business leaders play in enabling forward-thinking education. You’re not just managing budgets; you’re co-architects of what learning can look like. When you advocate for infrastructure that supports more educator and student agency, invest in tools that enable student creativity and create flexible spaces that foster collaboration, you’re helping to prepare students for a future we can’t fully predict, in which many will hold jobs that do not exist today.
So, whether you’re in a classroom or a district office, remember – you’re not just serving a system, you’re shaping the conditions for young minds to thrive in a rapidly evolving world. That’s an extraordinary responsibility and an even greater opportunity.
School business officials are co-architects of the future of education.
You helped shape North Carolina’s AI guidelines. What advice would you give other states or districts trying to balance innovation with responsible AI governance?
Do not confuse policy with guidelines. Policy sets the legal guardrails, but it is difficult to change as the landscape changes. Guidelines must act as a living road map that shows every stakeholder – including students, families, teachers, tech teams and education leaders – how AI will be used safely and responsibly, and how each group will move from AI-aware to AI-literate and, finally, to AI-fluent.
Convene a diverse writing team, publish findings openly, revise quickly and then scale. Create a plan to revisit and revise often as new developments happen almost daily, and the guidance will become stale if it’s not frequently updated to reflect the rapid progress in AI tools, new capabilities, new risks and increased understanding. North Carolina labels its recommendations a “living document” for exactly this reason.
What are the most pressing challenges schools and districts face when trying to implement AI effectively and equitably?
The “Homework Apocalypse”
Almost any traditional assignment can now be completed by AI in seconds. Detectors are not only unreliable but also inequitable, and banned tools are a phone-tap away for students. Instruction, assessment and academic integrity practices must evolve quickly.
Access gaps
Quality devices, bandwidth and paid AI tools with increased capabilities and better privacy protection and security remain uneven. Access to AI literacy, and the opportunity to learn to work effectively alongside AI tools, is the new digital divide. It must be addressed, or our lower socioeconomic-status students and underfunded schools will be left behind.
Professional learning
Teachers need job-embedded coaching to turn “wow” moments into purposeful practice and modeling for students.
Data privacy and security
Families will only support AI when they trust districts to protect student data. Vet tools carefully and be transparent with parents about what tools will be used and how their students’ data is protected.
Future-ready skills
AI fluency is a necessity, not an option. According to the World Economic Forum’s Future of Jobs Report 2025, 86% of employers expect AI and information-processing tech to transform their business by 2030, and they expect employees to be able to work alongside AI. Schools cannot prepare students with the skills to work alongside AI while concentrating efforts on preventing them from using AI or detecting its use.
These two priorities cannot exist simultaneously, so we must make a choice. We should shift our focus to teaching students to responsibly work alongside AI because that is what their future will demand.
What common misconceptions or fears about AI in education do you encounter, and how do you address them?
Misconception 1: “AI is just a timesaver.”
This mindset limits AI to automation, using tomorrow’s tools to do yesterday’s tasks faster. If we stop there, we miss AI’s transformative potential. In the classroom, this mindset results in digitized worksheets or AI-generated multiple-choice quizzes, with no change to the student experience. But AI can also be a creative and cognitive partner, helping students generate ideas, explore new questions, design community-centered solutions and model complex systems.
The same is true beyond the classroom. For school business leaders, AI is not just about streamlining payroll or scheduling buses more efficiently, though it can do that, too. It’s also about leveraging predictive analytics to forecast enrollment trends, using natural language processing to analyze parent feedback at scale, or simulating budget scenarios to align spending with student-centered goals. When AI is used thoughtfully across a school system, it unlocks time, insight and capacity for everyone involved in supporting student success.
Misconception 2: “Using AI is cheating.”
This fear is rooted in assessment models from a pre-AI era. Rather than try to ban AI, we need to redesign learning and evaluation. That means shifting from static outputs to dynamic processes. Students can be asked to document their AI use; explain their thinking; reflect on choices made; and demonstrate understanding through live discussions, peer feedback and revision cycles. These aren’t just ways to prevent cheating; they’re pathways to deeper learning and ethical AI fluency.
District leaders can mirror this shift in their own practices. Instead of fearing AI will be misused by staff or students, they can focus on modeling responsible implementation. For instance, some districts are already using AI to support grant writing, flag budget anomalies or conduct sentiment analysis on staff surveys. These tools don’t replace professional judgment; they augment it.
By fostering a culture of transparency and digital citizenship, leaders help set the tone for how AI should be used ethically and effectively throughout the system.
Misconception 3: “AI will replace teachers.”
This narrative overlooks what educators do best. No AI can read a classroom’s energy, notice a quiet student’s unspoken need or build trust with families. AI can process and synthesize data, but educators process humanity. The real opportunity is in partnership: When we offload repetitive tasks to AI, we give teachers more time to do what only they can – nurture curiosity, guide critical thinking and foster human connection.
This same logic applies to district leaders. When AI takes on repetitive, time-consuming tasks like compiling reports, analyzing transportation logistics or responding to common HR inquiries, it frees up time for strategic thinking and community engagement. AI isn’t here to replace roles but to elevate them.
The question isn’t if AI will change education; it already has, and we must adapt. The real question is whether we’ll shape that change with intentionality, equity and vision. Teachers, administrators and school business leaders all play a role in ensuring that AI doesn’t just improve efficiency but helps us reimagine learning environments where every student can thrive.
The question is, will we lead that transformation, or will we let it happen to us?
How can schools foster a culture of responsible AI use among students and staff?
- Provide ongoing, job-embedded professional development.
- Encourage teacher collaboration so colleagues can share successes, struggles and solutions.
- Model transparent, critical use. Educators should teach, model and reinforce real-time evaluation of AI output, cultivating student skepticism and discernment.
- Embed AI literacy across all subjects in age-appropriate ways, from early media fluency lessons to advanced prompting, AI-enhanced project-based learning (PBL) and design thinking projects in which high school students create AI tools to solve real problems.
- Co-write AI integrity pledges and link them to portfolio evidence.
- Create student AI ethics boards to help shape guidelines and flag blind spots.
From your vantage point, how is AI changing school systems beyond the classroom, and where will disruption land next?
Operations are already being rewired. In Colorado Springs, AI-optimized routing cut nearly 50% of bus routes and improved on-time arrivals during a driver shortage.
Special education teams are testing AI drafting tools to streamline IEP paperwork, while ensuring data privacy protection, freeing time for face-to-face support.
Next up: Predictive analytics for procurement and energy management, adaptive scheduling that aligns staffing with real-time needs and AI-guided career pathway counseling tied to local labor market data.
You’re part of several national and regional efforts around AI in education. What promising models or partnerships could be replicated elsewhere?
One of the most promising patterns I’ve seen across national and regional efforts is the shift from AI implementation as a tech initiative to AI as a cross-functional transformation, bringing together educators, school business leaders, IT directors and community voices. When these partnerships are educator-driven but operationally supported, the results are powerful.
In North Carolina, for example, we’ve taken a statewide approach to AI guidance that frames responsible implementation not just as a classroom issue, but as a systemwide opportunity. The NC Generative AI Guidelines emphasize ethical use, transparency and differentiated access aligned to student age and educator roles. What makes this replicable is that it doesn’t rely on a one-size-fits-all policy. Instead, it empowers local districts to develop plans that make sense for their context, whether that’s investing in teacher training, upgrading digital infrastructure or developing AI-informed procurement strategies. That kind of flexibility is only possible when school business officials are part of the planning process from the start.
Nationally, I’ve seen promising partnerships between states and non-profits, such as EdSAFE AI Alliance, The AI Education Project, Code.org, Day of AI and others, as well as partnerships with companies dedicated to responsible AI integration, such as AI for Education and PBL Future Labs – all focused on guiding schools and districts to integrate AI into authentic learning experiences in safe, equitable ways.
These models thrive when supported by district leaders who align budgeting, staffing and scheduling decisions to enable deeper learning. For example, rethinking traditional bell schedules or funding dedicated instructional coaches can unlock space for meaningful experimentation.
I’m also encouraged by emerging public-private partnerships, like collaborations with AI companies that prioritize educator co-design and equity of access. However, to replicate these responsibly, school business officials play a critical role in ensuring such partnerships prioritize student privacy, long-term value and mission alignment – not just short-term tech gains.
It’s also important for school business leaders to understand that most employees and students are already using AI, with the majority using free tools with very little data privacy protection that train on their data. Providing them with subscriptions to education versions of the tools ensures higher data privacy and security, and guarantees that their conversations with AI will not be used to train the models.
Ultimately, the most promising models I’ve encountered share a few key traits:
- Educators, administrators and school business leaders work side-by-side from the beginning.
- AI implementation is tied to deeper learning goals, not just operational efficiency.
- There’s a strong commitment to equity, transparency and adaptability.
If we’re going to lead this transformation rather than react to it, we need partnerships that blend instructional vision with operational foresight. That’s why school business officials aren’t just enablers, they’re co-architects of a more responsive, future-ready education system. The future of AI in education won’t be built in silos. It’ll be built at the intersection of pedagogy, policy and operations – working together.
What’s one thing about AI in education that you are most optimistic about, and one thing that has you concerned?
I’m most optimistic about how AI can accelerate the shift toward standards-aligned micro PBL that equips students with durable skills and AI fluency now, not years from now. What excites me is that this transformation doesn’t require massive overhauls or long policy cycles. It’s scalable, adaptable and cost-efficient when supported strategically.
School business leaders play a vital role in enabling this shift, whether that’s by aligning investments with resources to support the shift to more authentic learning, rethinking scheduling to allow interdisciplinary collaboration, or supporting professional learning that helps teachers build their own AI fluency. When operational strategies align with instructional innovation, we can deliver real impact at scale, preparing students for an AI-rich future, while also being good stewards of public resources.
My biggest concern is the rise of deepfakes and synthetic media that threaten trust in what we see, hear and believe.
In a world of AI-generated misinformation, AI literacy is a civic imperative, not just an educational upgrade. Ensuring all staff, students and parents understand how AI works, how it can deceive and how to use it ethically is foundational to safeguarding democracy, public discourse and responsible technology use.
We can’t afford to treat it as an optional add-on.
Julie Phillips Randles is a freelance writer based in California.