Subscribe to my ThriveinEDU newsletter to stay informed. (If you receive my newsletter, you may have read this, but just in case…here is part I)
Over the past eight months, I’ve had the opportunity to work with educators, school leaders, and district teams from twelve districts across the country as they navigate one of the biggest shifts education has experienced in decades: the arrival of artificial intelligence in everyday teaching and learning. This work is part of a national digital wellness and innovation initiative supporting districts as they develop responsible approaches to emerging technologies.
I work with a Task Force from each district to evaluate policies, create resources for families, and decide when and how to begin teaching students about AI, as well as how best to support educators. Some of these Task Forces also include students and parents. We have had many conversations about digital wellness, digital citizenship, screen time, and, of course, AI.
The conversations about AI included shared concerns, questions, and challenges. However, what has stood out the most in these conversations with these schools is not fear. It’s curiosity.
In classrooms, teachers are asking thoughtful questions about how AI can support student thinking rather than replace it. Administrators are working to align emerging tools with existing priorities such as digital citizenship, academic integrity, and student wellness. District teams are exploring how policy can move beyond restriction toward responsible guidance. Some are even completely rewriting their policies to align with these changes and make sure that a common language is used.
Recently, my work has included:
• Supporting district digital wellness and AI implementation planning
• Leading professional learning sessions on responsible AI use
• Presenting on AI and the law for educators
• Visiting classrooms to observe how students are already interacting with AI tools
• Collaborating with leadership teams and developing next-step strategies for staff support
• Designing activities for administrators and educators to evaluate policies and effective AI use
One consistent theme continues to emerge:
Districts, educators, and students are ready to lead.
Educators are not waiting for perfect answers to the big AI questions. They are considering the best pedagogical practices for using AI that protect students while expanding opportunities.
The most successful districts I’m working with right now are focusing on three priorities:
• Supporting educator confidence with clarity, examples, and time to explore
• Creating shared expectations for responsible use across classrooms and grade levels
• Preparing students to think critically about AI-generated information
Artificial intelligence isn’t just a technology conversation.
It’s a leadership conversation.
And I’m excited to continue working with and learning alongside school districts as they move forward with clarity, purpose, and a strong commitment to keeping human relationships at the center of innovation.
Providing the training
Artificial intelligence is changing expectations across nearly every profession. Schools are not the only organizations preparing for this shift.
In my work as an educator, attorney, and national presenter on responsible AI implementation, I support organizations as they explore how AI connects to decision-making, ethics, communication, and everyday professional practice.
I help schools and other organizations (law firms, healthcare professionals, business owners) implement AI responsibly through policy guidance, professional learning, and classroom-ready strategies grounded in both instructional practice and legal insight.
My sessions focus on helping teams:
• understand what AI can and cannot do
• recognize responsible-use considerations
• build confidence using emerging tools
• align implementation with organizational priorities
If your school, district, or organization is beginning conversations or looking to dive in and learn more about AI policy, professional learning, or responsible implementation, I’d welcome the opportunity to support your next steps through leadership workshops, keynote sessions, or strategic planning partnerships.
Preparing people is what makes AI implementation successful.
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. EdTech Digest named her the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including "What The Tech? An Educator's Guide to AI, AR/VR, the Metaverse and More" and "How To Teach AI." Other books include "In Other Words: Quotes That Push Our Thinking," "Unconventional Ways to Thrive in EDU," "The Future is Now: Looking Back to Move Ahead," "Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow's World," "True Story: Lessons That One Kid Taught Us," and "Things I Wish […] Knew." Her newest book, "How To Teach AI," is available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
Not long ago, artificial intelligence in education felt novel. It was something shiny, experimental, and, for many educators, unsettling at times. When ChatGPT arrived in November 2022, the initial conversations and concerns were focused more on fear. I recall receiving emails, text messages, phone calls, and visits from educators who were concerned about cheating, plagiarism, lost skills, and what instantly felt like an overwhelming pace of change. It was one more thing to adjust to, not long after the upheaval so many felt in March of 2020.
But since that initial adjustment to the increased use of AI in our world at the end of 2022 and through 2023, I've seen a shift happening. At first, there was skepticism, uncertainty, and hesitation, and not just in the world of education. However, as we've continued to adjust to new tools and new ways of working, I've noticed a shift from considering AI as a "what if" to the acceptance that AI is here and its use is increasing. It's embedded in tools educators already use, and if it hasn't already, it will slowly but surely become part of the daily routine and workflow of teaching and learning.
I’ve spoken about this shift from novelty to normalcy and how it brings a new challenge: educator upskilling.
A few years ago, I started researching the training available to educators and other professionals in AI. At the end of 2023, 87% of educators in the United States had not received any training. In my workshops, some attendees are having their first training experience more than three years after ChatGPT made its debut. So we need to focus on an important question, in education and beyond. The question is no longer whether educators need professional learning around AI; most people agree that they do. The bigger issue is whether we are approaching AI professional development in ways that are deep, sustained, and human-centered, or whether we're still offering one-and-done sessions that barely scratch the surface. With AI and the pace of change in education and the world, we need to do better and be prepared.
Shifting to Ongoing Capacity Building
When I completed my doctorate nearly two years ago, my research focused heavily on professional learning in emerging technologies, with a strong emphasis on AI. Even then, the message was clear. A single PD session, or even a series of short, tool-based trainings, was not enough, especially if completed early in the year or during a limited time span.
Yet that is often how AI PD is structured today. Through surveys in my sessions and conversations with other educators, a common experience keeps surfacing:
A 30-minute overview.
A 15-minute “certified educator” badge.
A walkthrough of one tool done well.
While these experiences can be helpful, especially for getting started and when time is limited, in the long term, they don’t build AI literacy. They build familiarity, whether with AI concepts or an AI tool. But familiarity is not AI literacy. Not for us as educators, nor for the students we are preparing for a future surrounded by AI and a world of work that seeks employees skilled in AI.
Continue reading the original post on Getting Smart.
Interested in writing a guest blog for my site? I would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
Also, check out my THRIVEinEDU Podcast here!
Join my show on THRIVEinEDU on Facebook. Join the group here.
During our ThriveinEDU livestream conversation about Kira, we explored a question that immediately resonated with educators:
What if planning, grading, and differentiation actually took half the time and still kept teachers in control of learning?
The question isn't just about efficiency. It's about sustainability and about supporting teachers in making instruction more responsive, more personalized, and more aligned to what students actually need in the moment: real-time responses, authentic feedback, and support from their teachers.
Kira recently released several new features (as part of its Kira 2.0 launch) that move beyond treating AI as a "lesson generator" or "assessment creator" and position it as a thought partner in the instructional workflow. After attending the Live Launch in New York on March 3rd and moderating the livestream, I want to share some of the biggest takeaways from the conversations that make the newest updates especially impactful for classrooms now.
Lesson/Course Studio
Many AI tools help teachers create one lesson at a time, which is highly beneficial and time-saving. But imagine you’re tasked with creating a course you’ve never taught or don’t have enough resources for. The amount of time needed is a bit overwhelming.
Kira's Course and Lesson Studio helps educators generate both structured lessons and full, standards-aligned courses, including course outlines, unit sequences, lesson progressions, and assessments.
Educators provide the topic, subject, grade level, and standards, and from that prompt, Kira builds the lesson with embedded formative checks already in place.
Formative assessment often happens after instruction; with Kira, teachers see student understanding during instruction.
As Rachel shared during the livestream:
“I don’t remember a time when I wasn’t taking work home or trying to get ahead of the game by planning out my week and then having to rewrite it midweek. It was so much work.”
Kira's curriculum-building features help break that cycle. Rather than rewriting lessons to meet student needs, teachers start with a flexible structure they can adapt immediately and, most importantly, stay in control. We do the editing, adjusting, and shaping of the lesson. This is an important distinction because it shows how crucial it is that teachers remain involved and review what has been generated.
Real-Time Insight Instead of End-of-Unit Surprises: Student Atlas
I have known about this for a few months and thought it was amazing. One of the most exciting updates in Kira 2.0 is Student Atlas, the platform’s student insight dashboard, now paired with Class Atlas, which brings those insights together at the class level.
Student Atlas provides:
• concept-level mastery tracking
• data confidence indicators
• individual student support indicators
• zones of proximal development insights
• intervention suggestions
Rather than relying on a single quiz or test score, teachers can see which concepts students understand and where they’re struggling in real time. It enables us to see what concepts need reinforcing now, rather than waiting until the assessment is over and graded.
Class Atlas builds on this by turning individual insights into a clear, actionable class-wide view. Instead of opening 20+ student profiles and piecing things together, teachers can instantly answer: Where should I focus my instruction? and Which students need help with this skill? Teachers can even ask Kira to explain how it generated its recommendations, which helps schools as they look for tools and want to trust AI technologies.
Student Atlas also includes a data confidence indicator, helping educators assess the reliability of recommendations before making instructional decisions. That transparency supports professional judgment instead of replacing it.
Standards Alignment
Standards alignment is often one of the most time-consuming parts of planning, especially when building units or courses. And for educators teaching multiple courses, that time multiplies. With Kira 2.0, the time requirement decreases because the platform automatically tags lessons, activities, assessments, and questions to state standards, underlying skill progressions, and Bloom's taxonomy levels.
Teachers can track how students are progressing through skills over time.
Supporting Multilingual Learners
Another standout feature we spoke about in the livestream is Kira’s built-in support for multilingual learners.
When gaps in understanding appear, Kira can generate:
• scaffolded practice
• targeted follow-up lessons
• leveled reading supports
• vocabulary scaffolds
• translated instructional materials
Each of these supports is based on individual student performance, not on a generic template that may not align with the student's needs.
Differentiation becomes responsive rather than reactive.
During the livestream, we talked about how, historically, differentiation required teachers to manually create multiple versions of lessons or assessments, which, of course, took a lot of time. With Kira, these supports are embedded directly inside the instructional workflow. Rachel said, “Especially talking about differentiation and the ease of it and being able to have the assistant nearby and go back and forth.”
Embedded support assists educators in providing what each student needs while giving them more time to work directly with each student.
Kira provides structure, but the teachers are the designers who provide the course’s vision.
Kira brings planning, assessment, differentiation, and student insight into one connected space. And when those pieces connect, teachers gain something incredibly valuable:
clarity, flexibility, time, and better visibility into learning
In Part 1, I shared why understanding the legal landscape of artificial intelligence is essential as schools continue to explore how these tools can support teaching and learning. Schools everywhere are thinking through policies and how to best provide resources for educators, students, and families. Awareness of laws such as FERPA, COPPA, and GDPR, accessibility requirements, and concerns such as algorithmic bias and deepfakes set an important foundation for responsible implementation.
We need guidelines and guardrails. A common question I hear from educators and leaders after presenting sessions and workshops, or speaking at conferences, is: “What do we do next?”
Understanding the guardrails is only the first step. The real work begins when schools start building systems that support educators in applying this knowledge in practical, sustainable ways. And it requires true collaboration.
Responsible AI Adoption Is a Team Effort
One of the most important shifts happening right now is the recognition that AI adoption and policy development should not be the responsibility of a single person or a select few administrators or IT teams. Responsible implementation and policy development require collaboration across roles.
District leaders are shaping policy and expectations for the school community.
Technology teams are evaluating vendor compliance and infrastructure readiness. (I have a future post coming up about IT Teams and ongoing PD).
Instructional leaders are aligning tools with learning goals and supporting teachers with implementation.
Teachers are modeling and supporting ethical classroom use.
Students are exploring and developing AI literacy skills that will shape how they interact with technology throughout their lives.
What I truly believe is that when schools recognize AI is a shared responsibility rather than an isolated initiative, implementation becomes more intentional, reflective, and sustainable.
I consistently see this when working with districts across the country. The schools that are moving forward with confidence are not the ones adopting the most tools. They are the ones creating a community, developing a common language, and building shared understanding first.
Transparency Builds Confidence Across the Community
Another theme that has been coming up in conversations with educators and families is trust.
Families want and need to know:
What tools are being used?
What information is being collected?
How is student data protected?
How is AI, or any technology, being used in support of learning rather than replacing it?
Having clear answers to these questions helps to strengthen the essential partnerships between schools and families. It also creates opportunities for students to participate more actively in conversations about responsible technology use.
Transparency is not simply a compliance strategy. It is a relationship-building strategy. When schools communicate clearly and proactively, they reduce uncertainty and help communities better understand how innovation supports student success.
AI Literacy Is Now Part of Digital Citizenship
One of the biggest shifts happening in education right now is the expansion of digital citizenship to include AI literacy. We’ve been talking about media literacy, digital literacy, AI literacy, and even discernment. Our work is a bit more involved now, and we need to be prepared.
Students are already interacting with AI systems daily, both in school and, perhaps more frequently, outside of it. They need guidance, which means classrooms must play an essential role in helping students understand:
How to protect their personally identifiable information (PII)
How AI systems generate responses
How bias can appear in outputs
How misinformation spreads
How data is collected and used
How to evaluate whether a tool should be trusted
AI literacy is not about teaching students how to use a single platform. It is about helping them develop judgment.
When students learn how to ask better questions about technology, they become more confident learners and more thoughtful digital citizens. Emerging tools continue to shape how students research, communicate, and create, and as educators, we have to keep learning so we can guide them to use the tools available to them safely and successfully.
Accessibility and Equity at the Center
As schools explore AI tools, accessibility must be a part of every conversation.
AI has tremendous potential to support multilingual learners, provide personalized feedback, assist with reading and writing tasks, and help students access content in new ways. It has endless ways to support educators. Schools must continue evaluating whether tools meet accessibility expectations and support equitable learning experiences.
Responsible implementation means asking questions such as:
Does this tool improve students’ access?
Does it create barriers? (Recent conversations about the digital divide make this question especially important.)
Does it support multiple learning pathways?
Does it align with universal design principles? Or a Portrait of a Graduate or an AI-Ready graduate?
Technology should expand opportunity rather than narrow it.
Supporting Educators Through the Transition
One of the most encouraging things I have seen in my work with educators is their investment in learning and the desire to learn with and from their students.
Educators are exploring AI tools while also asking important questions about privacy, ethics, and instructional impact. This balance is exactly what responsible adoption should look like.
Professional learning plays an essential role.
Educators benefit from opportunities to:
Explore tools safely
Review privacy expectations
Understand policy implications
Design classroom strategies
Collaborate with colleagues
Develop shared language around responsible use
When professional learning includes both legal awareness and classroom application, educators feel more confident making decisions that support students. Confidence leads to stronger implementation. And this is the work I am most passionate about when working with schools.
Leadership Matters More Than Ever
School leaders are in a unique position to support responsible AI adoption by:
Developing clear expectations
Supporting cross-team collaboration
Communicating with families (consistently)
Reviewing vendor agreements carefully
Building a common language around the use of AI
Creating space for experimentation, but having guardrails in place
Moving Forward
Artificial intelligence is already part of the learning landscape. We should not be talking about whether schools should engage with AI, but rather deciding how they will engage with it.
When schools combine legal awareness, transparency, accessibility considerations, and strong professional learning structures, they create innovative environments built on human decision-making.
Students benefit when educators feel confident.
Educators benefit when leaders provide clarity.
Communities benefit when schools communicate openly.
Responsible AI adoption is about moving forward with purpose.
When schools take that approach and have a team to work with, they are preparing students to understand technology, question it, and be the ones who determine what comes next.
Artificial intelligence (AI) is rapidly transforming education. From lesson planning support to personalized learning pathways and administrative efficiencies, AI tools are a more common part of everyday classroom practices. At the same time, the speed at which this technology has advanced and been adopted into classrooms has led to understandable uncertainty among educators, leaders, and families who are asking important questions. These groups are concerned with the data that is being collected, who owns AI-generated work, and what responsibilities schools have when students and educators use these tools.
As both an attorney and educator who has spent more than eight years researching, teaching, presenting, and writing about AI, I have worked with schools across K–12 and higher education that are navigating these exact questions. The legal implications of AI are not barriers to innovation; I consider them guardrails that help schools adopt technology responsibly. The key is protecting students, educators, and institutions and staying informed. Understanding the legal landscape and the potential legal implications of using AI in classrooms helps schools move forward with confidence rather than hesitation.
Why AI and the Law Matter in Education
AI relies on data in order to function effectively. For schools, this means access to student information, classroom artifacts, writing samples, images, and even data related to physical or behavioral information. Intent is not the deciding factor: even if educators believe they are sharing only minimal information that does not clearly identify a student, family member, or colleague, seemingly harmless details can qualify as personally identifiable information (PII).
I often point to examples like referencing a favorite restaurant, a local landmark, a pet's name, or an extracurricular activity, all of which could make a student identifiable when combined with other data points. Last year, an educator in one of my sessions said, "Enough stars to still form a constellation," and that phrase has stuck with me; I have shared it in every AI and the Law session since. That is why evaluating tools carefully, and teaching students to do the same, is essential. Educators should not feel like they are on a scavenger hunt when trying to find out what happens to their information. We need transparency from vendors so that educators are aware and informed.
AI is also changing how decisions are made in schools. With many advances, there are recommendation systems, automated feedback tools, and predictive analytics that can influence learning pathways, grading practices, and student support services. Having an understanding of how these systems work and how they should be used responsibly is becoming part of educators’ and school leaders’ professional responsibilities.
Key Laws That Shape AI Use in Schools
There are several important laws that guide how schools must approach AI.
FERPA (Family Educational Rights and Privacy Act) protects the privacy of student education records. When schools use AI-powered platforms that process student work or store learning data, they must ensure that these tools comply with FERPA requirements and clearly define how student information is handled.
COPPA (Children’s Online Privacy Protection Act) applies to students under the age of 13 and requires parental consent before collecting personal information through online services. Because many AI tools rely on user-generated input, COPPA compliance becomes especially important in elementary and middle school settings.
GDPR (General Data Protection Regulation), although it is a European Union law, is relevant to U.S. schools that use tools developed by companies that operate internationally. There are many platforms created outside of the United States that educators may be unaware of, and so understanding GDPR is essential. Many platforms now include cookie permissions and data-use customization features in response to GDPR requirements. These protections often benefit schools globally.
Schools should also consider state-level student data privacy laws, which are increasingly changing the expectations for vendor contracts, third-party integrations, and data retention timelines. District leaders and IT teams play an essential role in ensuring these requirements are addressed before tools are introduced into classrooms.
Data Privacy and Vendor Responsibility
AI tools require large amounts of data to function effectively. That data may be used to improve the tool itself, train additional models, or support integrations across connected platforms. Even when a tool states that it does not share user data, connected services or embedded features may still interact with stored information. I was asked two years ago, when speaking at LACOE in California during my AI and the Law session, if someone should “trust the platform when it says they do not share or store the data.” My instant answer was “No.” And it was for this exact reason.
Before introducing any AI platform in schools, educators and school leaders should review terms of service, privacy policies, and compliance documentation. Look for references to FERPA, COPPA, and additional privacy protections. Look for the date that the privacy policy was most recently updated. Districts should also confirm whether vendors use student information to train future AI models and whether contracts clearly define ownership and storage expectations.
This is where collaboration with district technology teams becomes essential. Responsible adoption is not an individual teacher’s decision. It is a system-level responsibility supported by leadership, policy teams, and instructional staff working together. Collaboration is key.
Transparency Builds Trust With Students and Families
Responsible AI adoption depends on communication. Families deserve clear explanations of the tools being used, the data being collected, and how that data is protected.
When working with students under age 13, written parental consent may be required. Even when it is not legally necessary, providing families with opportunities to ask questions strengthens trust and partnership. Transparency also empowers students. When students understand how AI systems work and the risks they may pose, they become more thoughtful digital citizens and more informed users of technology.
Schools that proactively communicate expectations for AI use are more likely to build families’ confidence and reduce misunderstandings about how these tools support learning.
Accessibility, Equity, and Emerging Legal Considerations
As schools adopt AI tools, accessibility and equity must remain part of the conversation. Laws such as Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA) require that digital learning tools be accessible to all students. If AI-powered platforms create barriers rather than support access, schools may face compliance concerns. We need to consistently audit the tools we are using. It must be an ongoing process.
Schools must also consider how AI intersects with Title IX responsibilities, especially with the rise of deepfake technology, which creates new risks related to harassment and student safety. Policies must be in place to address the misuse of generative AI tools and must clearly define expectations and response procedures.
Algorithmic bias and fairness are important parts of the conversation. Schools should evaluate whether AI systems produce equitable outcomes across student groups and whether automated recommendations influence learning opportunities in unintended ways. Responsible implementation includes ongoing evaluation, not just initial approval.
Teaching Digital Citizenship With AI Literacy
Legal compliance alone is not enough. Students must also develop the skills needed to evaluate AI responsibly.
Developing skills in these areas means recognizing risks such as deepfakes and misinformation, bias in generated content, and cyberbullying that is supported by emerging technologies. Schools that integrate digital citizenship with AI literacy will guide students to become thoughtful participants in technology-rich environments rather than passive users who lack true understanding.
Clear expectations around appropriate use and academic integrity help students develop ethical decision-making skills that extend beyond the classroom.
Supporting Schools and Organizations Through AI and Legal Guidance
As AI adoption accelerates, schools will benefit from having a structured support system in place that connects legal awareness with thoughtful and purposeful classroom practice. Through my work with educators in K–12 and higher education, I provide professional learning experiences that help schools understand privacy requirements, implement responsible AI strategies, and align classroom applications with policy expectations.
My work includes keynote presentations, workshops, district leadership sessions, curriculum planning support, and customized training focused on data privacy, academic integrity, digital citizenship, accessibility considerations, vendor evaluation, and responsible AI adoption. Each training is tailored to address specific needs, ranging from introductory awareness sessions to deeper implementation planning and leadership strategy development.
In addition to supporting schools and universities, I work with organizations across other sectors to explore how to implement AI responsibly while remaining aligned with legal expectations and organizational values. Many industries face the same challenges that educators do: uncertainty about data privacy, questions about intellectual property ownership, concerns about transparency in decision-making systems, and the need to develop policies that support ethical innovation. My work helps organizations evaluate tools thoughtfully, identify potential risks early, and create practical guardrails that support responsible adoption rather than reactive compliance.
Organizations in healthcare, legal services, workforce development, nonprofit leadership, and corporate training environments are increasingly recognizing the importance of AI literacy for employees at every level. Through workshops, leadership sessions, and strategy conversations, I help teams understand how AI systems work, the legal considerations that may be applicable to them, and how to build cultures of responsible use that prioritize trust, security, and human judgment.
Moving Forward With Confidence
Artificial intelligence is already shaping how students learn, communicate, and prepare for future careers. The goal is not simply to adopt AI tools, but to adopt them responsibly. And this is where our work as educators comes in and why we need to dive in and learn with and guide our students.
When educators understand the legal landscape of privacy, accessibility, intellectual property, and ethical use, they can make informed decisions that support innovation and student protection. With thoughtful planning, collaboration, and transparency, schools will create learning environments where AI enhances opportunities while maintaining trust, safety, and integrity across the entire school community.
I work with schools and organizations, both in person and virtually, to support thoughtful and responsible AI implementation through professional learning, curriculum design, and resource development specific to educators, students, and families, using a common language. I have also collaborated with leadership teams to develop AI guidance frameworks, classroom-ready activities, and policies that reflect legal considerations.
The resources created help districts communicate clearly and consistently with families about AI use, support educators in building AI literacy, and provide students with age-appropriate strategies for using AI safely, ethically, and responsibly. By combining legal insight with classroom experience, I help schools move beyond uncertainty toward sustainable systems that include clear expectations, transparency, and actionable guardrails for responsible use.
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. EdTech Digest named her the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More,” “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” and “Things I Wish […] Knew.” Her newest, “How To Teach AI,” is available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
**Interested in writing a guest blog for my site? Would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
************ Also, check out my THRIVEinEDU Podcast here!
Join my show on THRIVEinEDU on Facebook. Join the group here.
In collaboration with Learning Genie: All Opinions are my own
If there’s one thing I value in education, it’s authentic and honest conversations about what’s really happening in classrooms. The January and February Learning Latte meetups with Learning Genie were exactly that.
These meetups offered grounded, reflective discussions about teacher preparation, real classroom challenges, and how tools like Learning Genie can support, rather than replace, our professional judgment. And with a focus on UDL, Portrait of a Graduate, and Differentiation, Learning Genie offers everything in one solution!
Here are some takeaways:
January: Teacher Preparation, TPA Season & the “Idea Inventory”
January’s Learning Latte meetup focused on the importance of and value in truly listening to educators.
One of the most important parts of the conversation came from Robert Mayfield, who addressed a challenge that many of us have seen and experienced firsthand: pre-service teachers during the TPA season.
If you’ve worked with student teachers, you have likely seen how this season affects them as they get started. They can be:
Overwhelmed
Time-strapped
Focused on and worried about meeting rubric requirements
Relying heavily on pre-existing lesson plans
Trying to survive and balance all of the new tasks that come with our work.
Robert highlighted a key concern: When pre-service teachers rely too heavily on ready-made lessons, they may miss the opportunity to build their own instructional toolkit. That’s where the concept of an “idea inventory” comes in.
What Is an Idea Inventory?
An idea inventory is not just a folder of saved lessons over the course of the school year or years. It is a curated, reflective collection of strategies used, activity ideas, differentiation techniques, assessment approaches, and adaptable frameworks.
The inventory includes:
Multiple entry points for learners
Flexible scaffolding ideas
Variations for different readiness levels
Culturally responsive examples
Developmentally aligned strategies
All of this is especially critical in early childhood and elementary settings, where differentiation is foundational.
The January discussion reinforced what I have noticed when working with other educators. New teachers need to understand how to differentiate effectively and have the resources they need to support their work.
This is where Learning Genie can make an impact. It supports reflective planning and enables teachers to connect observations to instruction. It makes differentiation visible, which is essential.
A good question to consider is: “How do we help future teachers think like designers of learning?”
Learning Genie supports that mindset shift. When teachers reflect on student observations and use those insights to plan intentionally, it helps build professional capacity and confidence. And it builds community when educators and companies connect!
Enjoy learning from and sharing feedback with Dr. Gene Shi
February’s Learning Latte offered a clear view and many insights into a lived classroom experience.
February’s meetup included educators Sandy Ferguson and Gina Ogilvie. Sandy began by sharing classroom experiences, grounding the conversation in real practice rather than theory.
I always want to know the stories of other educators, the why behind the choices in activities, strategies, and tools used in their classrooms, and the impact.
Many conversations about edtech center on features, dashboards, and integrations. But as I’ve long said, and as I heard in their message, what matters is the impact a tool makes inside the classroom.
Highlights from Sandy and Gina
Authentic Application The conversation centered on how Learning Genie supports educators’ daily work. It helps with lesson planning, documentation, and communication, and it is easy to navigate and use.
Alignment with Developmental Needs In early childhood, especially, the tools we use must align with how children learn best.
Teacher Confidence When educators feel supported in leveraging technology to provide meaningful and personalized instruction, their confidence increases. Teacher confidence impacts classroom climate and positively boosts student engagement and interest in learning.
What stood out is that technology works best when it amplifies teacher expertise rather than replacing it, and when it enhances our students’ learning opportunities. Shifting our focus from replacement to the enhancement and transformation potential of these tools is important. Check out this video to learn more.
Connecting January and February: A Common Theme
Both sessions highlighted:
The importance of reflective practice
The need for intentional differentiation
The value of building professional capacity over time
The role of tools in supporting rather than shortcutting professional growth
January focused on building the foundation by helping new teachers develop their idea inventory. February provided a clear view of what this looks like in action, with experienced educators using tools to refine their professional practice and deepen students’ learning impact.
Final thoughts
The best educational tools don’t give us answers. I think that they help us ask better questions.
How are we differentiating?
What patterns are we noticing?
How are we building our “idea inventory?”
How are we supporting new teachers before they burn out?
Use these questions as a focus point, and I think you will find that a tool like Learning Genie is a catalyst for transformational and meaningful instruction and learning.
Enjoy sharing about Learning Genie in Pittsburgh and other conferences and school PD sessions!
Technology is evolving at a pace we have never experienced before. Artificial intelligence, automation, data science, and other emerging technologies have brought so many changes to the world and are reshaping industries in real time. As an educator, I feel this shift daily, and I push myself to keep learning and looking for opportunities to do more for my students. The challenge is no longer simply preparing students for a job. It’s preparing them for careers that may not even exist yet, and supporting them as they develop a broad range of skills along the way.
When I think about how to prepare students for the uncertainty around the world of work, I look at insights from the World Economic Forum and its Future of Jobs research. While AI was listed as #3 for 2027 and is now listed as #1 for 2030, the other rankings reinforce what we already know: adaptability, analytical thinking, creativity, and resilience are becoming increasingly important in our world.
If we cannot predict the careers that will exist five or ten years from now, the best we can do is prepare students to be flexible thinkers, confident problem-solvers, and ethical technology users. And this is why I believe that career-connected learning is essential.
Redefining “Career Ready”
When I thought about “career ready” in the past, I aligned it with strong academics plus essential skills such as communication, collaboration, and the other “soft skills.” These remain foundational and necessary for success. However, with the changes in technology, there are other areas that I believe must become part of preparing students to be career-ready. Now, I include:
Digital and AI literacy
Ethical reasoning in technology use
Data awareness and cybersecurity knowledge
The ability to evaluate and question AI-generated information
Comfort navigating complex digital systems
Students need to understand how to use tools like generative AI to enhance, not replace, their own learning. They can learn to brainstorm with AI, analyze outputs for bias or inaccuracy, and recognize when human judgment must be at the forefront, providing consistent oversight. Research and interviews with employers have shown that employees will be expected to work alongside AI systems. That preparation has to begin in our classrooms, from K through 12 and beyond.
Career-connected learning ensures students understand how what they are studying connects with real careers and real-world impact.
Why This Matters Now More Than Ever
According to projections highlighted by the World Economic Forum, millions of roles will be displaced due to automation, while millions of new ones will emerge. This is not the first time. More than 100 years ago, thousands of traffic light controllers in New York were displaced due to automation. They did not all lose their livelihoods; many shifted into other roles. And many of these new positions demand higher-order thinking, digital agility, and ethical decision-making.
I like to talk about some career options that minimally existed a few years ago:
AI prompt engineer
Ethical technologist
Data privacy consultant
These are some of the many growing fields of work, some of which are expanding because of AI. I think about how we are preparing our students and believe that career-connected learning will help to show the connections between classroom content and workforce relevance. I also believe this is something that can be done in every classroom and in all content areas.
What Does Career-Connected Learning Look Like?
Career-connected learning is more than occasional career days. It is something that is embedded into daily instruction, not an extra element. It can include a variety of possibilities, such as:
Project-based learning connected to community or industry challenges, which builds relevance for students
Integration of AI, data science, and emerging technologies
Authentic problem-solving rooted in real scenarios
Partnerships with local businesses, universities, or nonprofits
Coding, AI, and cybersecurity challenges
Through opportunities like these, we can foster the development of student agency. When students understand how what they are learning connects to real opportunities, it sparks curiosity and increases student engagement and motivation. Learning becomes more purposeful, authentic, and meaningful.
Some ideas:
Artificial intelligence is an area that students need to understand. They need to know how AI systems function, how to evaluate the outputs, how bias can be embedded, and what the ethical responsibilities are for using AI. In career-connected classrooms, students might discuss and explore how the legal field, healthcare and business industries, and schools are using AI tools. They can engage in role-playing that focuses on ethical decision-making. The goal is for students to leverage AI as a partner in learning, rather than a replacement.
STEM is a great option to focus on career-connected learning. In my own classroom experiences, I’ve seen what happens when students combine AI tools with engineering design, language learning, and problem-solving. When students train image classifiers and then collaborate, problem-solve, and evaluate where the model fails, they are not just learning about the technology, they are developing skills in critical analysis and bias detection.
Cybersecurity is another area that is seeing tremendous growth. Students need to understand how their data is collected, protected, and in some cases, misused. There are hundreds of thousands of cybersecurity roles unfilled in the United States alone, yet many students, and perhaps even educators, have not heard of careers such as threat analyst or security operations engineer. Lessons on cybersecurity can be done in all classes. Here are some examples that I have shared:
English: Analyze phishing emails as persuasive writing
With all of the technology, especially AI and automation, we have to keep focused on what makes us uniquely human. Technology will continue to evolve, even faster than it has been. But empathy, integrity, resilience, and collaboration will always matter, and we need to make sure that students develop these skills.
With career-connected learning opportunities, we will prepare students for success in the future, even in careers that don’t exist yet. We will offer opportunities for them to discover their interests and purpose, and to be prepared to embrace the changes they will encounter and be successful.
Throughout the country, states and districts are taking different approaches to student cell phone use. Some have implemented complete bans, while others are leaving the decision to individual schools or educators.
What I’ve learned over the past 12 years of using devices in my classroom is that while policies can help create structure, they don’t build consistent digital habits. Digital wellness has to be taught, modeled, practiced, and reflected upon.
Why tech habits matter
With so much access to technology, we need to guide students in developing good digital habits. Digital wellness involves helping students understand when technology is helpful, when it becomes draining, and how to make intentional choices that will keep them balanced and present. Cell phone bans and updated device policies have been designed to promote digital wellness in our schools.
I’ve observed that in schools with cell phone bans, students are more interactive with one another, and their socialization skills are improving. For some students, knowing where their phone is and having it close by is important, and I can relate. But I also understand the importance of disconnecting and being present in the moment, especially in our classrooms, to be more focused on learning.
I have done a variety of activities with students and educators focused on digital habits. In one of them, I focus on the “benefits” and “drains” of devices. A simple way to start is with activities that help students map their “digital day.” Ask them to list all the ways they use their phone or other devices from morning to night. Next, have them decide when the use helps learning (taking a photo of notes, defining or translating a word, keeping time, conducting research, or even recording a podcast draft) or benefits their well-being (such as tracking steps, doing meditation, or using focus apps). They then identify when it is draining (doomscrolling or game-playing; checking notifications; causing reduced energy, lack of attention, or mood changes).
Continue reading the rest of my article on Edutopia.
In collaboration with Delightex Edu. All opinions are my own.
Over the past 9 years, using Delightex Edu (formerly CoSpaces Edu) with my students, I have seen it continually add features that spark curiosity, boost creativity, and offer more engaging ways for students to build their knowledge. I have often said that we need to move students from consumers to creators, to innovators, and with Delightex Edu, students don’t just consume content, they create immersive worlds. Students and educators can design 3D worlds, build interactive environments, and leverage all of the options for coding and creating a more authentic and personalized product.
Delightex Edu is a highly visual, user-friendly, intuitive system that helps students develop essential skills such as collaboration, creativity, logic, and problem-solving that will lead to future success. These skills have long been in demand, and that is not changing; what is changing is how students can develop these and other essential future-ready skills.
Most recently, Delightex has added AI features to its already robust platform. Artificial intelligence is not a futuristic concept. I have been speaking about augmented and virtual reality and AI for more than eight years, and these concepts are not going away. They have become part of everyday life, shaping how we work, communicate, and create.
As digital literacy evolves, students need opportunities not just to use AI, but also to understand it, question it, and use and create with it responsibly. Delightex Edu’s latest update takes what it already offers to a new level. AI enhances the creative experience, expanding what students can build while engaging them in hands-on, safe, and exciting learning opportunities.
The new AI features focus on three essential principles: smarter creation, deeper learning, and safe innovation.
AI to amplify creation and not replace student creativity
One of the most important things that I have shared with students and educators is that the new AI features should not be thought of as a substitute for students’ own thinking and creativity. Instead, they should amplify learning while also teaching students about AI’s capabilities in a safe space, which is what matters as we help them build content skills and AI literacy.
Students are still in control and taking the lead as they create and apply their knowledge in new ways. They are still the designers, the coders, the curious learners, and the storytellers. AI is just another tool in the Delightex toolbox. They now have more opportunities to learn about prompting, generate the images they want, and develop true AI literacy alongside computational thinking skills.
AI Buddies: Bringing Worlds to Life
Whether for students or educators, Delightex Edu is so much fun to dive into and start creating with, especially with AI Buddies: AI-powered 3D characters that can talk, react, and express emotions through real-time animations. AI Buddies are defined with a short prompt and can act as guides, tutors, narrators, or characters in a story. They make creating with Delightex so much fun for anyone.
AI Buddies are a fun addition to any project. They respond via text and can also use expressive animations that make interactions feel more natural and believable. Students can set proximity triggers in their environment so that an AI Buddy responds automatically when someone enters a specific area of a scene. This is a game-changer: it shifts a static environment into a more responsive and immersive experience.
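The proximity-trigger idea can be sketched in plain code. To be clear, this is an illustrative Python sketch only, not Delightex Edu’s actual API; the radius value, function names, and greeting text are all hypothetical. The logic, though, mirrors what students set up visually: when a visitor enters a set distance around an AI Buddy, the Buddy responds once.

```python
import math

# Hypothetical sketch of a proximity trigger -- not Delightex Edu's API.
TRIGGER_RADIUS = 3.0  # made-up distance, in scene units

def distance(a, b):
    """Straight-line distance between two (x, y) positions in the scene."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def check_proximity(buddy_pos, visitor_pos, greeted):
    """Fire the Buddy's greeting once when the visitor enters the radius."""
    if not greeted and distance(buddy_pos, visitor_pos) <= TRIGGER_RADIUS:
        return "Welcome! Let me show you around this part of the scene.", True
    return None, greeted

# A visitor about 2.2 units away is inside the 3.0-unit radius, so the Buddy speaks.
message, greeted = check_proximity((0, 0), (1, 2), greeted=False)
print(message)
```

The `greeted` flag is the key design detail: without it, the Buddy would re-trigger on every check while the visitor stands inside the radius.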
When I think about the possibilities and how AI Buddies will amplify learning, they can help students create more engaging stories, interactive simulations, and even role-based learning. Imagine a historical figure who can speak to students, or a science or language class with a virtual guide who walks users through a location unique to the content. Characters in a story can respond differently depending on the choices the player makes.
These possibilities also bring some reminders. Safety, especially when it comes to AI, is critical. With Delightex Edu, teachers control student access by license, by class, or for each individual student. Guardrails, Content Guard, and AI History ensure that interactions stay age-appropriate and transparent and remain reviewable by the teacher.
AI Skills: Coding and AI Literacy
Adding AI Buddies to a student’s project brings their story and their world to life. With AI Skills, students decide how the characters will act.
AI Skills enable students to design actions using visual coding and assign them to AI Buddies. Using Delightex’s CoBlocks system, AI Skills combine traditional visual logic with simple prompts. Students still define conditions, test behaviors, and refine outcomes as before, but now the characters can respond more naturally to dialogue and intent.
Until now, students learning to code were programming only event-based responses: “when this happens, do that.” Now, students can think about how intelligent systems interpret meaning. This can lead to great classroom conversations around questions such as:
How does a character decide what action makes sense?
What happens when prompts are unclear?
How do logic and language work together?
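For anyone who wants to make that contrast concrete, here is a hypothetical Python sketch; none of this is Delightex code. The first function follows a fixed “when this happens, do that” rule, while the second is a rough stand-in for interpreting intent (a real AI Skill would use a language model rather than keyword matching).

```python
def event_based(event):
    """Classic event-driven rule: a fixed lookup from event to action."""
    rules = {"door_opened": "wave", "player_nearby": "greet"}
    return rules.get(event, "idle")

def intent_based(player_says):
    """Crude stand-in for interpreting meaning: pick the intent whose
    keywords best overlap what the player said. (Illustrative only --
    actual AI-driven characters interpret language far more flexibly.)"""
    intents = {
        "give_directions": {"where", "find", "way", "lost"},
        "explain_topic": {"what", "why", "how", "explain"},
    }
    words = set(player_says.lower().split())
    best = max(intents, key=lambda name: len(words & intents[name]))
    # An unclear prompt matches nothing, so the character asks for clarity.
    return best if words & intents[best] else "ask_to_clarify"

print(event_based("door_opened"))                  # fixed rule fires: 'wave'
print(intent_based("Where can I find the lab?"))   # interpreted as giving directions
```

The fallback branch is where the classroom questions above come alive: when a prompt is unclear, the keyword sets overlap with nothing, and the character has to ask rather than act.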
AI-Generated 360° Worlds Inside 3D Scenes
One of my favorite new AI features is that I can dream big and write fun prompts that generate beautiful images. Through Delightex Edu’s Skybox integration, you can generate AI-powered 360° images right inside 3D scenes. Before this feature, backdrops were limited; now any 3D scene can be transformed into a fully immersive 360° environment, truly expanding creative possibilities. Students can instantly generate any backdrop they can imagine for their stories, simulations, or virtual field trips. Once they create their new background, they can select from all of the options for characters, objects, and more. It boosts student engagement and promotes more experiential learning.
Why This Is Important for the Future of Learning
As I explored these recent updates, I realized they are moving us toward what digital literacy should look like in an AI-powered world.
Whether early learners, older students, or educators, everyone needs opportunities to create with AI and understand its capabilities. And they need to be able to do so in safe environments where experimentation is encouraged, guardrails are in place, and active learning is available. Delightex Edu is a platform where AI enhances creativity, deepens understanding of new technologies, supports the acquisition of content knowledge, and prepares students for future work and learning.
Delightex is always at the forefront, adding great features that bring amazing learning possibilities to students, and I’m looking forward to what comes next. I am excited for all students who will be able to apply their knowledge in exciting and innovative ways!
To learn more and have fun creating, visit delightex.com/edu. Explore the gallery, check out the resources, and then start your own project! Have fun learning!
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. By EdTech Digest, she was named the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More,” “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” and “Things I Wish […] Knew.” Her newest, “How To Teach AI,” is available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
**Interested in writing a guest blog for my site? Would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
************ Also, check out my THRIVEinEDU Podcast here!
Join my show on THRIVEinEDU on Facebook. Join the group here.
In my previous post, I focused on reflection. Thinking about it, if 2025 was a year of recalibration in education, the year ahead feels like it will move in a more intentional direction.
After slowing down, reflecting, and identifying what felt misaligned, educators now face an important decision. One option is to thoughtfully consider what we carry forward: what should we keep because it makes sense and makes an impact? The other is to decide what we need to leave behind so that we can make that impact.
The future of education is not about moving faster, adopting more tools, or trying to keep up with all of the changes, because that is neither reasonable nor purposeful. And in full transparency, that is exactly what I thought years ago. After ongoing reflection, I now know that I should focus on how I can align and drive innovation with purpose, humanity, and care. Especially humanity.
As we look to the future and do our best to plan and prepare, several themes have emerged with greater clarity, at least in my experience: artificial intelligence, wearable technology, digital wellness, AI literacy, and a greater focus on student agency. Each of these generates opportunities to learn and continue to grow. Educators and students should engage in ongoing reflection, and, for educators, this requires asking better questions before making decisions about what is best for our classrooms.
Progress Without So Much Pressure
One of the greatest hopes I have for education, now and in the future, is that progress does not come at the expense of people. We need “humans in the loop,” as we have heard many times and will probably continue to hear. Schools are involved in so many initiatives that, at times, it is absolutely exhausting for any educator, regardless of how long they have been in education.
Sometimes we invest our time and effort into an initiative, spending hours, days, or weeks, only to have it disappear from the conversation that same school year or in the not-too-distant future. The time we spend on these initiatives takes us away from the truly impactful work we could be doing instead. Initiatives are important and, in many instances, required; however, focusing on them can lead to reactive, technology-first decision-making rather than proactive decision-making, which negatively impacts what truly matters: our students and our own learning. The goal should not be to hesitate when it comes to innovation, but to integrate intentionally, transparently, carefully, and responsibly.
AI in Education: From Capability to Responsibility
Artificial intelligence is the number one in-demand skill. Look at the World Economic Forum’s predictions for skills and jobs in demand, and you will see. AI will continue to shape education this year and in upcoming years, and it will continue to evolve as technology advances. There are some things I think about when considering AI and other technologies on the rise.
Sometimes I think that rather than thinking about what AI can do, maybe we should ask:
What should AI do?
When does AI support thinking, and when does it replace it?
How do we ensure AI is used ethically, transparently, and equitably?
A Few Predictions for AI in Education
AI should, and hopefully will, become more embedded in everyday tools rather than standing alone as an extra, an add-on, or something time-consuming for educators and students to use.
Schools will shift away from banning AI and, I hope, toward supporting educators as they teach responsible use and attribution.
AI will support feedback, differentiation, and accessibility, especially for multilingual learners and students with disabilities or diverse learning needs.
There will be greater emphasis on process over product, requiring students to be more accountable for how they arrive at an answer, and to know why that matters.
My hope is that AI is, or will be, considered a thought partner, not a replacement for the work we do. I hope that educators feel empowered to shape its role in their work rather than react to it, because reacting removes the opportunity for learning and growing.
Wearable Technology
Wearable technology is something many people may not think much about, yet it has become commonplace. Examples include smartwatches, fitness trackers, and biometric tools, all of which will continue to be part of conversations about learning, health, and attention. Many of the digital wellness conversations I have had bring these technologies up, and educators are trying to determine whether they are draining, beneficial, or a mix of both.
While these tools offer potential insights into movement, focus, and well-being, the use of and reliance on them also raise important concerns about privacy, data ownership, and surveillance, which are serious concerns when it comes to emerging technology.
Predictions for Wearable Tech in Education
Increased discussion around student consent and data ethics.
Greater use of wearables for self-awareness and personal growth tracking, which is beneficial.
Stronger guardrails in place to protect any data that is collected.
Greater integration with digital wellness initiatives rather than performance monitoring.
The goal should not be to track students and their habits without ongoing support, but instead to empower students to understand their attention, habits, and how they use and rely on technology in healthy ways.
AI Literacy: Fundamental, Not Supplemental
One of the most important goals for the year ahead is recognizing AI literacy as a fundamental skill, not a supplemental one.
AI literacy is not just technical knowledge. It includes:
Understanding how AI systems work at a high level
Recognizing bias, limitations, and hallucinations
Knowing when AI is appropriate and when it isn’t
Practicing ethical use, attribution, and transparency
Developing critical thinking in AI-supported environments
Predictions for AI Literacy
AI literacy will begin appearing across disciplines—not just in computer science.
Educators will focus more on questioning, evaluating, and reflecting than on tool mastery.
Students will be asked to justify decisions made with AI support.
Schools will prioritize human skills such as judgment, empathy, and creativity, along with the development of technical fluency.
The goal should be that students graduate knowing not only how to use AI, but also how to think with discernment in an AI-shaped world.
Guiding Questions
As schools plan for the remainder of the 2025-2026 school year, the most important tool may not be a new platform or site, but rather some guiding questions to push reflective thinking.
When Evaluating AI Tools
Does this tool amplify learning, or does it simply replace thinking?
How transparent is the AI about its limitations?
What skills do students still need to demonstrate independently, and how do we hold them accountable for those skills?
How are we teaching ethical use and attribution?
Considering Innovation
Does the technology align with our values?
Does it support student well-being?
Does it simplify learning, or does it make it more complex?
Are educators provided with time and voice in its implementation? What about students?
These are just a few questions I have considered, and I think they can help shift decisions from reactive to proactive and reflective.
As educators look ahead, reflection remains essential to our work; it should not require us to do more, but instead guide us to focus on what matters most.