Preparing Educators for an AI Future Means Preparing Leaders First
In my last article, I shared my thoughts about what I’ve been learning from working with district leadership teams across the country as they navigate questions about artificial intelligence, digital wellness, and purposeful technology use. My work has provided me with tremendous opportunities to learn from educators, students, and families.
Conversations about screen time, purposeful technology use, and digital balance are happening everywhere. What I’ve found most insightful is when students and educators sit down together for open, honest conversations about these topics and learn from one another. A common theme runs through most of these conversations: we have to focus on more than just the technology, especially when talking about AI use in schools. Too often, the focus lands first on specific tools. When the topic of artificial intelligence in schools comes up, the questions have been:
Which platform should we allow? What should students be permitted to use? What policies do we need?
These are important questions. But they are not the first questions schools should be asking.
The first question schools should be asking is:
How prepared are our educators to lead in an AI-shaped learning environment?
Successful implementation is not about technology adoption.
Introducing AI into classrooms is easy. Helping educators understand how to use it meaningfully is the real work. And with support comes confidence.
Educator readiness is the real implementation strategy
Across the districts I have worked with, I’ve noticed that the biggest predictor of successful AI integration is not access to tools, but whether educators feel supported as they navigate these changes.
I believe schools will see more progress and success when clear goals are set, educators have time to explore, expectations are communicated clearly and consistently, and policies are in place that emphasize guidance rather than restriction. AI implementation, like any technology integration, succeeds when educators understand not only how to use tools, but why they should use them and what the impact is on student learning. This is echoed in what I am hearing from students around the country.
Across classrooms nationwide, students are using an increasing number of digital tools in their classes. However, students tell me they are not always consistently guided on how to use these tools safely, ethically, and responsibly. That students want clarity is a powerful insight. That they want more purposeful use of technology is even more powerful. How can this happen?
By supporting educators first, because supporting educators is how we support students.
Leadership sets the tone
One of the most powerful influences on AI adoption, technology use, and the establishment of standards for communication and screen time in a school system is leadership modeling.
When administrators ask for feedback, communicate transparently, dive in to explore tools with teachers, and acknowledge uncertainty while providing direction, they create a safe environment for innovation. Leadership like this builds trust, and trust makes responsible implementation possible.
Preparing students means preparing adults first
Students will graduate and enter workplaces shaped by automation, intelligent systems, and evolving expectations around collaboration with technology. According to the World Economic Forum’s Future of Jobs Report, the three fastest-growing skills through 2030 are AI and big data, networks and cybersecurity, and technological literacy. Students are not the only ones preparing for that future. Educators need to be prepared so that our students are too.
Professional learning on AI is no longer optional. It is an essential part of instructional readiness. The schools making the most progress right now are engaging in conversations to build systems that help educators adapt confidently as change continues. And that may be the most important preparation strategy of all.
Supporting educators means strengthening entire school systems. This is one of the most important investments districts can make as they prepare students for an AI-shaped, AI-driven future.
Stay tuned for part 3 of this Leading Forward Series.
I help schools and other organizations (law firms, healthcare professionals, business owners) implement AI responsibly through policy guidance, professional learning, and classroom-ready strategies grounded in both instructional practice and legal insight.
My sessions focus on helping teams:
• understand what AI can and cannot do
• recognize responsible-use considerations
• build confidence using emerging tools
• align implementation with organizational priorities
If your school, district, or organization is beginning conversations or looking to dive in and learn more about AI policy, professional learning, or responsible implementation, I’d welcome the opportunity to support your next steps through leadership workshops, keynote sessions, or strategic planning partnerships.
Preparing people is what makes AI implementation successful.
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. EdTech Digest named her the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More” and “How To Teach AI.” Other books include “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” and “Things I Wish […] Knew.” Her newest, “How To Teach AI,” is available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
Subscribe to my ThriveinEDU newsletter to stay informed. (If you receive my newsletter, you may have read this, but just in case…here is part I)
Over the past eight months, I’ve had the opportunity to work with educators, school leaders, and district teams from twelve districts across the country as they navigate one of the biggest shifts education has experienced in decades: the arrival of artificial intelligence in everyday teaching and learning. This work is part of a national digital wellness and innovation initiative supporting districts as they develop responsible approaches to emerging technologies.
I work with a Task Force from each district to evaluate policies, create resources for families, and decide when and how to begin teaching students about AI, as well as how best to support educators. Some of these Task Forces include students and parents. We have had many conversations about digital wellness, digital citizenship, screen time, and, of course, AI.
The conversations about AI included shared concerns, questions, and challenges. However, what has stood out most in these conversations is not fear. It’s curiosity.
In classrooms, teachers are asking thoughtful questions about how AI can support student thinking rather than replace it. Administrators are working to align emerging tools with existing priorities such as digital citizenship, academic integrity, and student wellness. District teams are exploring how policy can move beyond restriction toward responsible guidance. Some are even completely rewriting their policies to align with these changes and make sure that a common language is used.
Recently, my work has included:
• Supporting district digital wellness and AI implementation planning
• Leading professional learning sessions on responsible AI use
• Presenting on AI and the law for educators
• Visiting classrooms to observe how students are already interacting with AI tools
• Collaborating with leadership teams and developing next-step strategies for staff support
• Designing activities for administrators and educators to evaluate policies and effective AI use
One consistent theme continues to emerge:
Districts, educators, and students are ready to lead.
Educators are not waiting for perfect answers to the big AI questions. They are considering the best pedagogical practices for using AI that protect students while expanding opportunities.
The most successful districts I’m working with right now are focusing on three priorities:
Supporting educator confidence: they need clarity, examples, and time to explore.
Creating shared expectations for responsible use across classrooms and grade levels.
Preparing students to think critically about AI-generated information.
Artificial intelligence isn’t just a technology conversation.
It’s a leadership conversation.
And I’m excited to continue working with and learning alongside school districts as they move forward with clarity, purpose, and a strong commitment to keeping human relationships at the center of innovation.
Providing the training
Artificial intelligence is changing expectations across nearly every profession. Schools are not the only organizations preparing for this shift.
In my work as an educator, attorney, and national presenter on responsible AI implementation, I support organizations as they explore how AI connects to decision-making, ethics, communication, and everyday professional practice.
Not long ago, artificial intelligence in education felt novel. It was something shiny, experimental, and, for many educators, unsettling at times. When ChatGPT arrived in November 2022, the initial conversations and concerns centered on fear. I recall receiving emails, text messages, phone calls, and visits from educators who were concerned about cheating, plagiarism, lost skills, and what instantly felt like an overwhelming pace of change. It was one more thing to adjust to, not long after the overwhelming disruption many felt in March of 2020.
But since that initial adjustment to the increased use of AI at the end of 2022 and through 2023, I’ve seen a shift. At first there was skepticism, uncertainty, and hesitation, and not just in education. As we’ve continued to adjust to new tools and new ways of working, I’ve noticed the conversation move from treating AI as a “what if” to accepting that AI is here and its use is increasing. It’s embedded in tools educators already use, and if it hasn’t already, it will steadily become part of the daily routine and workflow of teaching and learning.
I’ve spoken about this shift from novelty to normalcy and how it brings a new challenge: educator upskilling.
A few years ago, I started researching the AI training available to educators and other professionals. At the end of 2023, 87% of educators in the United States had not received any AI training. In my workshops, some attendees are having their first training experience more than three years after ChatGPT made its debut. So I think we need to focus on an important question, in education and beyond. The question is no longer whether educators need professional learning around AI. Most people agree that they do. The bigger issue is whether we are approaching AI professional development in ways that are deep, sustained, and human-centered, or whether we’re still offering one-and-done sessions that barely scratch the surface. With AI and the pace of change in education and the world, we need to do better and be prepared.
Shifting to Ongoing Capacity Building
When I completed my doctorate nearly two years ago, my research focused heavily on professional learning in emerging technologies, with a strong emphasis on AI. Even then, the message was clear. A single PD session, or even a series of short, tool-based trainings, was not enough, especially if completed early in the year or during a limited time span.
Yet that is often how AI PD is structured today. Through surveys in my sessions and conversations with other educators, a common experience emerges:
A 30-minute overview.
A 15-minute “certified educator” badge.
A walkthrough of one tool done well.
While these experiences can be helpful, especially for getting started and when time is limited, in the long term, they don’t build AI literacy. They build familiarity, whether with AI concepts or an AI tool. But familiarity is not AI literacy. Not for us as educators, nor for the students we are preparing for a future surrounded by AI and a world of work that seeks employees skilled in AI.
Continue reading the original post on Getting Smart.
Interested in writing a guest blog for my site? I’d love to share your ideas! Submit your post here. Looking for a new book to read? Find my books at bit.ly/Pothbooks.
Also, check out my THRIVEinEDU Podcast here!
Join my THRIVEinEDU show and group on Facebook. Join the group here.
What are GenAI technologies, and what do we want them to become? Right now, GenAI is an educational chameleon, aggressively marketed as an indispensable learning companion, an academic partner, and a labor-saving tool; and at the same time, widely critiqued as a dangerous source of misinformation and biased responses, an environmental degrader, and a privacy invader. Since GenAI is all of these things and more, how do we use these tools appropriately and thoughtfully?
What GenAI is and what it will become depends on YOU – how you think about its roles, use it in your teaching and learning, and describe its functions to others.
Let’s look at two currently popular descriptions and uses of GenAI: 1) GenAI as a companion; 2) GenAI as a productivity-enhancing tool.
First, GenAI is widely described and used as a supportive “companion” or helpful “partner.” The Harvard Business Review (2025) reported that therapy/companionship was the number one way people were using GenAI in 2025. An alarming number of teens acknowledge that GenAI chatbots are their virtual companions, even though this technology can exploit youngsters’ emotional needs in ways that lead to self-harm and other risks (Common Sense Media report, Robb & Mann, 2025). One of the key problems here is that GenAI is NOT human, and it is not even intelligent (at least in the way humans perceive and describe intelligence).
The Key Takeaway: Using terms like “partner” or “companion” to describe GenAI technologies humanizes tools that are not designed to provide the support, guidance, and level of intelligence that actual humans can provide.
Second, GenAI technologies are widely presented as productivity-enhancing, time-saving, efficiency-increasing tools for people to use to improve their lives. “Use ChatGPT to make life easier,” declared a recent email advertisement, where all one had to do was “just tap a chat to start.” Personal and professional productivity is also one of the top ways people are using GenAI technologies – from writing emails and reports, to planning vacations and meals, to studying for exams; and it is certainly true that GenAI technologies can do all these things, and much more, remarkably fast. Yet personal autonomy, creativity, and agency are lost when one uses GenAI technologies to automate activities they formerly did without them.
The key takeaway: Avoid talking about GenAI as automating work and think directly about how it can augment or supplement your activities as a teacher and a learner.
So, if not a human-like companion or a productivity-enhancing automation tool, how can we think about the role of GenAI in education? We believe that GenAI is best used when it augments teaching and learning, much like the way a caddie enhances the golf experience. As such, we offer a metaphor of GenAI as a caddie, but again remind you that it is not an actual caddie, and we are not trying to humanize this tool.
Professional golfers and their caddies on the LPGA, PGA, and more than 20 professional golf tours worldwide offer a metaphor for thinking about, describing, and using GenAI. Each pro golfer has a caddie who carries their clubs and walks alongside them during competitive tournaments, sharing ideas and information about the shots they are playing. For instance, until recently, LPGA player Brooke Henderson’s caddie was her older sister, Brittany; PGA player Xander Schauffele’s caddie is Austin Kaiser (his college golf teammate at San Diego State University).
Caddies have detailed information about the course and provide suggestions and feedback about what shots to hit with which clubs. They help keep track of the pace of play and how conditions of the course may be changing due to wind, weather, and time of day. However, it is the golfer who remains totally in charge of the outcomes of the game. Caddies do not hit the golf ball; golfers do not always do what the caddie suggests. It is the golfer who must make decisions, hit the shots, and deal with consequences, both positive and negative, in terms of performance and score. Caddies are there to augment the golf experience and outcome.
When it comes to teaching and learning, GenAI can be that source of information, ideas, or inspiration like a caddie; and it is the teacher who must determine what to do with that information. They have the expertise; they understand their classroom dynamics and contexts; they know their students, their topic, their grade level, and their community.
The key is for the teacher to resist the temptation to automate their work by turning it entirely over to a GenAI technology, because then GenAI, rather than the teacher, is in control of the shots. It is as if a professional golfer let the caddie choose the club and then hit the ball for them. The problem runs deeper with GenAI: in our metaphor, the caddie is a human with expertise who has played golf before, but GenAI is not a teacher, has never taught, and has no idea what teaching is. Turning tasks over to a tool with no expertise in education can become genuinely problematic. Teachers must maintain agency and exert control, deciding when to accept, when to reject, and when to modify whatever ideas and information the GenAI provides.
So, returning to our original statement, what GenAI is and what it will become depends on YOU – how you think about its roles, use it in your teaching and learning, and describe its functions to others. What do YOU want GenAI to be?
Nearly 50 years ago, at the outset of the computer revolution in schools, Seymour Papert asked: Will computers program the child, or will educators create the conditions where children program computers? For Papert then, as for us today in the age of GenAI, using technology remains a question of human control and user agency. GenAI can provide amazing resources, but it is essential that you retain your decision-making and personal creativity. Only then will the results be truly yours.
Torrey Trust, Ph.D., is a Professor of Learning Technology in the College of Education at the University of Massachusetts Amherst. Her work centers on empowering educators and students to critically explore emerging technologies and make thoughtful, informed choices about their role in teaching and learning. Dr. Trust has received the University of Massachusetts Amherst Distinguished Teaching Award (2023), the College of Education Outstanding Teaching Award (2020), and the International Society for Technology in Education Making IT Happen Award (2018), which “honors outstanding educators and leaders who demonstrate extraordinary commitment, leadership, courage, and persistence in improving digital learning opportunities for students.” More recently, Dr. Trust has been a leading voice in exploring GenAI technologies in education and has been featured by several media outlets in articles and podcasts, including Educational Leadership, U.S. News & World Report, WIRED, Tech & Learning, The HILL, and EducationWeek. www.torreytrust.com
Robert W. Maloy is a senior lecturer in the College of Education at the University of Massachusetts Amherst, where he coordinates the history teacher education program and co-directs the TEAMS Tutoring Project, a community engagement/service learning initiative through which university students provide academic tutoring to culturally and linguistically diverse students in public schools throughout the Connecticut River Valley region of western Massachusetts. His research focuses on technology and educational change, teacher education, democratic teaching, and student learning. He is co-author of AI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students, Transforming Learning with New Technologies (4th edition); Kids Have All the Write Stuff: Revised and Updated for a Digital Age; Wiki Works: Teaching Web Research and Digital Literacy in History and Humanities Classrooms; We, the Students and Teachers: Teaching Democratically in the History and Social Studies Classroom; Ways of Writing with Young Kids: Teaching Creativity and Conventions Unconventionally; Kids Have All the Write Stuff: Inspiring Your Child to Put Pencil to Paper; The Essential Career Guide to Becoming a Middle and High School Teacher; Schools for an Information Age; and Partnerships for Improving Schools.
In Part 1, I shared why understanding the legal landscape of artificial intelligence is essential as schools continue to explore how these tools can support teaching and learning. Schools everywhere are thinking through policies and how to best provide resources for educators, students, and families. Awareness of laws such as FERPA, COPPA, and GDPR, accessibility requirements, and concerns such as algorithmic bias and deepfakes set an important foundation for responsible implementation.
We need guidelines and guardrails. A common question I hear from educators and leaders after presenting sessions and workshops, or speaking at conferences, is: “What do we do next?”
Understanding the guardrails is only the first step. The real work begins when schools start building systems that support educators in applying this knowledge in practical, sustainable ways. And it requires true collaboration.
Responsible AI Adoption Is a Team Effort
One of the most important shifts happening right now is the recognition that AI adoption and policy development should not be the responsibility of a single person or a select few administrators or IT teams. Responsible implementation and policy development require collaboration across roles.
District leaders are shaping policy and expectations for the school community.
Technology teams are evaluating vendor compliance and infrastructure readiness. (I have a future post coming up about IT Teams and ongoing PD).
Instructional leaders are aligning tools with learning goals and supporting teachers with implementation.
Teachers are modeling and supporting ethical classroom use.
Students are exploring and developing AI literacy skills that will shape how they interact with technology throughout their lives.
What I truly believe is that when schools recognize AI is a shared responsibility rather than an isolated initiative, implementation becomes more intentional, reflective, and sustainable.
I consistently see this when working with districts across the country. The schools that are moving forward with confidence are not the ones adopting the most tools. They are the ones creating a community, developing a common language, and building shared understanding first.
Transparency Builds Confidence Across the Community
Another theme that has been coming up in conversations with educators and families is trust.
Families want and need to know:
What tools are being used?
What information is being collected?
How is student data protected?
How is AI, or any technology, being used in support of learning rather than replacing it?
Having clear answers to these questions helps to strengthen the essential partnerships between schools and families. It also creates opportunities for students to participate more actively in conversations about responsible technology use.
Transparency is not simply a compliance strategy. It is a relationship-building strategy. When schools communicate clearly and proactively, they reduce uncertainty and help communities better understand how innovation supports student success.
AI Literacy Is Now Part of Digital Citizenship
One of the biggest shifts happening in education right now is the expansion of digital citizenship to include AI literacy. We’ve been talking about media literacy, digital literacy, AI literacy, and even discernment. Our work is a bit more involved now, and we need to be prepared.
Students are already interacting with AI systems daily, both in school and, perhaps more frequently, outside of it. They need guidance, which means classrooms must play an essential role in helping students understand:
How to protect their personally identifiable information (PII)
How AI systems generate responses
How bias can appear in outputs
How misinformation spreads
How data is collected and used
How to evaluate whether a tool should be trusted
AI literacy is not about teaching students how to use a single platform. It is about helping them develop judgment.
When students learn how to ask better questions about technology, they become more confident learners and more thoughtful digital citizens. Emerging tools continue to shape how students research, communicate, and create, and as educators, we have to keep learning so we can guide them to use the tools available to them safely and successfully.
Accessibility and Equity at the Center
As schools explore AI tools, accessibility must be a part of every conversation.
AI has tremendous potential to support multilingual learners, provide personalized feedback, assist with reading and writing tasks, and help students access content in new ways. It also offers many ways to support educators. Schools must continue evaluating whether tools meet accessibility expectations and support equitable learning experiences.
Responsible implementation means asking questions such as:
Does this tool improve students’ access?
Does it create barriers, particularly for students affected by the digital divide?
Does it support multiple learning pathways?
Does it align with universal design principles, or with a district's Portrait of a Graduate or AI-Ready graduate profile?
Technology should expand opportunity rather than narrow it.
Supporting Educators Through the Transition
One of the most encouraging things I have seen in my work with educators is their investment in learning and the desire to learn with and from their students.
Educators are exploring AI tools while also asking important questions about privacy, ethics, and instructional impact. This balance is exactly what responsible adoption should look like.
Professional learning plays an essential role.
Educators benefit from opportunities to:
Explore tools safely
Review privacy expectations
Understand policy implications
Design classroom strategies
Collaborate with colleagues
Develop shared language around responsible use
When professional learning includes both legal awareness and classroom application, educators feel more confident making decisions that support students. Confidence leads to stronger implementation. And this is the work I am most passionate about when working with schools.
Leadership Matters More Than Ever
School leaders are in a unique position to support responsible AI adoption by:
Developing clear expectations
Supporting cross-team collaboration
Communicating consistently with families
Reviewing vendor agreements carefully
Building a common language around the use of AI
Creating space for experimentation, with guardrails in place
Moving Forward
Artificial intelligence is already part of the learning landscape. We should not be talking about whether schools should engage with AI, but rather deciding how they will engage with it.
When schools combine legal awareness, transparency, accessibility considerations, and strong professional learning structures, they create innovative environments built on human decision-making.
Students benefit when educators feel confident.
Educators benefit when leaders provide clarity.
Communities benefit when schools communicate openly.
Responsible AI adoption is about moving forward with purpose.
When schools take that approach and have a team to work with, they are preparing students to understand technology, question it, and be the ones who determine what comes next.
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. By EdTech Digest, she was named the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More,” “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” “Things I Wish […] Knew,” and her newest, “How To Teach AI,” available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
Interested in writing a guest blog for my site? I would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
Also, check out my THRIVEinEDU Podcast here!
Join my show on THRIVEinEDU on Facebook. Join the group here.
Artificial intelligence (AI) is rapidly transforming education. From lesson planning support to personalized learning pathways and administrative efficiencies, AI tools are becoming a more common part of everyday classroom practice. At the same time, the speed at which this technology has advanced and been adopted in classrooms has led to understandable uncertainty among educators, leaders, and families, who are asking important questions about the data being collected, who owns AI-generated work, and what responsibilities schools have when students and educators use these tools.
As both an attorney and an educator who has spent more than eight years researching, teaching, presenting, and writing about AI, I have worked with schools across K–12 and higher education that are navigating these exact questions. The legal implications of AI are not barriers to innovation; I consider them guardrails that help schools adopt technology responsibly. The key is protecting students, educators, and institutions while staying informed. Understanding the legal landscape, and the potential legal implications of using AI in classrooms, helps schools move forward with confidence rather than hesitation.
Why AI and the Law Matter in Education
AI relies on data in order to function effectively. In schools, this means access to student information, classroom artifacts, writing samples, images, and even physical or behavioral data. Intent is not the deciding factor. Even if educators believe they are sharing only minimal information that does not clearly identify a student, family member, or colleague, seemingly harmless details can qualify as personally identifiable information (PII).
I’ve often shared examples like referencing a favorite restaurant, a local landmark, a pet’s name, or an extracurricular activity, all of which could make a student identifiable when combined with other data points. Last year, an educator in one of my sessions described these fragments as “enough stars to still form a constellation,” a phrase that has stuck with me and that I have shared in every AI and the Law session since. That is why evaluating tools carefully, and teaching students to do the same, is essential. Educators should not feel as if they are on a scavenger hunt when trying to find out what happens to their information. We need transparency from vendors so that educators are aware and informed.
AI is also changing how decisions are made in schools. With many advances, there are recommendation systems, automated feedback tools, and predictive analytics that can influence learning pathways, grading practices, and student support services. Having an understanding of how these systems work and how they should be used responsibly is becoming part of educators’ and school leaders’ professional responsibilities.
Key Laws That Shape AI Use in Schools
There are several important laws that guide how schools must approach AI.
FERPA (Family Educational Rights and Privacy Act) protects the privacy of student education records. When schools use AI-powered platforms that process student work or store learning data, they must ensure that these tools comply with FERPA requirements and clearly define how student information is handled.
COPPA (Children’s Online Privacy Protection Act) applies to students under the age of 13 and requires parental consent before collecting personal information through online services. Because many AI tools rely on user-generated input, COPPA compliance becomes especially important in elementary and middle school settings.
GDPR (General Data Protection Regulation), although it is a European Union law, is relevant to U.S. schools that use tools developed by companies that operate internationally. There are many platforms created outside of the United States that educators may be unaware of, and so understanding GDPR is essential. Many platforms now include cookie permissions and data-use customization features in response to GDPR requirements. These protections often benefit schools globally.
Schools should also consider state-level student data privacy laws, which are increasingly changing the expectations for vendor contracts, third-party integrations, and data retention timelines. District leaders and IT teams play an essential role in ensuring these requirements are addressed before tools are introduced into classrooms.
Data Privacy and Vendor Responsibility
AI tools require large amounts of data to function effectively. That data may be used to improve the tool itself, train additional models, or support integrations across connected platforms. Even when a tool states that it does not share user data, connected services or embedded features may still interact with stored information. I was asked two years ago, when speaking at LACOE in California during my AI and the Law session, if someone should “trust the platform when it says they do not share or store the data.” My instant answer was “No.” And it was for this exact reason.
Before introducing any AI platform in schools, educators and school leaders should review terms of service, privacy policies, and compliance documentation. Look for references to FERPA, COPPA, and additional privacy protections. Look for the date that the privacy policy was most recently updated. Districts should also confirm whether vendors use student information to train future AI models and whether contracts clearly define ownership and storage expectations.
This is where collaboration with district technology teams becomes essential. Responsible adoption is not an individual teacher’s decision. It is a system-level responsibility supported by leadership, policy teams, and instructional staff working together. Collaboration is key.
Transparency Builds Trust With Students and Families
Responsible AI adoption depends on communication. Families deserve clear explanations of the tools being used, the data being collected, and how that data is protected.
When working with students under age 13, written parental consent may be required. Even when it is not legally necessary, providing families with opportunities to ask questions strengthens trust and partnership. Transparency also empowers students. When students understand how AI systems work and the risks they may pose, they become more thoughtful digital citizens and more informed users of technology.
Schools that proactively communicate expectations for AI use are more likely to build families’ confidence and reduce misunderstandings about how these tools support learning.
Accessibility, Equity, and Emerging Legal Considerations
As schools adopt AI tools, accessibility and equity must remain part of the conversation. Laws such as Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA) require that digital learning tools be accessible to all students. If AI-powered platforms create barriers rather than support access, schools may face compliance concerns. We need to consistently audit the tools we are using. It must be an ongoing process.
Schools must also consider how AI intersects with Title IX responsibilities, especially with the rise of deepfake technology, which creates new risks related to harassment and student safety. Policies must address the misuse of generative AI tools and clearly define expectations and response procedures.
Algorithmic bias and fairness are important parts of the conversation. Schools should evaluate whether AI systems produce equitable outcomes across student groups and whether automated recommendations influence learning opportunities in unintended ways. Responsible implementation includes ongoing evaluation, not just initial approval.
Teaching Digital Citizenship With AI Literacy
Legal compliance alone is not enough. Students must also develop the skills needed to evaluate AI responsibly.
Developing skills in these areas means recognizing risks such as deepfakes and misinformation, bias in generated content, and cyberbullying that is supported by emerging technologies. Schools that integrate digital citizenship with AI literacy will guide students to become thoughtful participants in technology-rich environments rather than passive users who lack true understanding and AI literacy skills.
Clear expectations around appropriate use and academic integrity help students develop ethical decision-making skills that extend beyond the classroom.
Supporting Schools and Organizations Through AI and Legal Guidance
As AI adoption accelerates, schools will benefit from having a structured support system in place that connects legal awareness with thoughtful and purposeful classroom practice. Through my work with educators in K–12 and higher education, I provide professional learning experiences that help schools understand privacy requirements, implement responsible AI strategies, and align classroom applications with policy expectations.
My work includes keynote presentations, workshops, district leadership sessions, curriculum planning support, and customized training focused on data privacy, academic integrity, digital citizenship, accessibility considerations, vendor evaluation, and responsible AI adoption. Each training is tailored to address specific needs, ranging from introductory awareness sessions to deeper implementation planning and leadership strategy development.
In addition to supporting schools and universities, I work with organizations across other sectors to explore how to implement AI responsibly while remaining aligned with legal expectations and organizational values. Many industries face the same challenges that educators do, surrounding uncertainty about data privacy, questions about intellectual property ownership, concerns about transparency in decision-making systems, and the need to develop policies that support ethical innovation. My work helps organizations evaluate tools thoughtfully, identify potential risks early, and create practical guardrails that support responsible adoption rather than reactive compliance.
Organizations in healthcare, legal services, workforce development, nonprofit leadership, and corporate training environments are increasingly recognizing the importance of AI literacy for employees at every level. Through workshops, leadership sessions, and strategy conversations, I help teams understand how AI systems work, the legal considerations that may be applicable to them, and how to build cultures of responsible use that prioritize trust, security, and human judgment.
Moving Forward With Confidence
Artificial intelligence is already shaping how students learn, communicate, and prepare for future careers. The goal is not simply to adopt AI tools, but to adopt them responsibly. And this is where our work as educators comes in and why we need to dive in and learn with and guide our students.
When educators understand the legal landscape of privacy, accessibility, intellectual property, and ethical use, they can make informed decisions that support innovation and student protection. With thoughtful planning, collaboration, and transparency, schools will create learning environments where AI enhances opportunities while maintaining trust, safety, and integrity across the entire school community.
I work with schools and organizations, both in person and virtually, to support thoughtful and responsible AI implementation through professional learning, curriculum design, and resource development specific to educators, students, and families, using a common language. I have also collaborated with leadership teams to develop AI guidance frameworks, classroom-ready activities, and policies that reflect legal considerations.
The resources created help districts communicate clearly and consistently with families about AI use, support educators in building AI literacy, and provide students with age-appropriate strategies for using AI safely, ethically, and responsibly. By combining legal insight with classroom experience, I help schools move beyond uncertainty toward sustainable systems that include clear expectations, transparency, and actionable guardrails for responsible use.
Guest post by Dr. Torrey Trust and Dr. Robert Maloy
Welcome to “Students, Teachers, and Chatbots!” In this monthly series, you will find classroom-ready learning plans to use as you explore different civic engagement issues and topics with students. Each learning plan is connected to one of the ISTE (International Society for Technology in Education) Standards for Students.
Agency for learners means each individual is actively involved in what is happening educationally and instructionally in classrooms and schools. Agency, however, is more than paying attention in class, completing assignments on time, and earning high grades on tests. Agency also means students believe they have a voice and choice in what and how they are learning. They believe they can take actions in their lives based on what they are learning in schools.
In social studies education, agency is connected to civic education, and by extension, democratic teaching in democratic classrooms. Teaching about democracy is a cornerstone of civics education, where students learn the foundations of government of the people, for the people, by the people. Democracy offers everyone a voice and choice in making decisions collectively and collaboratively. In theory, the same is true in democratic classrooms. Yet, in the past three decades, the practice of democratic classrooms has faded from view. In school after school, standardized achievement exams have brought with them greater emphasis on teacher control and accountability, large group instruction, and teaching to the test (Ravitch, 2016).
In the current era of mandated curriculum frameworks and high-stakes testing, learning about democracy in many classes is focused on memorizing the branches and structures of national, state, and local government; reviewing the history of the American Revolution and other signature events in U.S. history; and learning the names of well-known historical figures. Democracy is rarely a lived experience for students.
When we asked college students, “What do you remember was your first experience with democracy?” many responded with puzzled expressions. When we clarified that by “first experience with democracy” we meant the first time they recalled having personal agency, feeling that their voice mattered, or being part of a collective decision-making process, most recalled voting for the first time. But when pressed to think back to when they were younger, some recalled experiences with democracy in family meetings where adults and children shared ideas and made plans; at summer camps and recreation programs where campers had choices about playtime activities; in libraries where young readers chose which books to read; and on sports teams where coaches let youngsters try many different positions and choose the ones they found most engaging. Those we spoke with valued these experiences because they felt their choices mattered and their decisions were respected, even when the adults in charge did not always agree.
In the following bonus learning plan from our AI and Civic Engagement book, student agency is front and center – students are encouraged to research, design, and work together to create real change that is meaningful to them and their schools.
Chapter 9 (Global Collaborator)
Bonus AI-Enhanced Learning Plan: AI Literacy for All: Collaboratively Crafting an AI Curriculum for Your School
Student Engagement Question: How do you think we should be using AI in our classes and school?
AI technologies play a significant role in the lives of teachers, students, administrators, families, and community members everywhere. As the latest GenAI tools, models, and features are released, all of us are learning more and more about the possibilities and complexities of artificial intelligence and its place in education.
Elected officials and policymakers have ideas for what needs to be done for AI in education. The White House Office of Science and Technology Policy under President Biden issued “A Blueprint for an AI Bill of Rights.” The European Union urged developers and users to ensure safe, secure, and trustworthy AI. Lawmakers in Congress have introduced the AI Literacy Act, intended to address the reality that “communities most often negatively impacted by AI-enabled technologies often have the least access to AI education” (Section 2: Findings). One group of researchers from the National Education Policy Center has urged a pause in the use of AI tools in schools to give everyone time to develop guidelines and regulations about their use for in-person and online learning (Williamson, Molnar, & Boninger, 2024). Organizations, including Common Sense Media and OpenAI, are working together to create AI education guidelines (Kelly, 2024).
But what do students think about the role of AI in their education? Should they have opportunities to use GenAI in every class, subject, and topic? Should they learn about the ethical issues surrounding the design and production of GenAI tools (e.g., hallucination, bias, environmental impact, exploitation of human labor, intellectual property rights)? Should they have opportunities to build their AI-Ready workforce skills?
This learning plan invites students to ensure their voice is heard when it comes to AI in their education. As global collaborators, they can work with others to develop an AI curriculum for their class, school, and/or district.
Learning Goal: Students will collaboratively draft an AI curriculum for their class, school, or district.
ACTIVITY 1: Research AI Curriculum Models and AI Literacy Frameworks/Models with GenAI
Invite students to curate a collection of AI curriculum frameworks, AI literacy frameworks and models, and any other resources and materials that can help them design an AI curriculum for their school or district. GenAI technologies can be a starting point for the research:
Example Prompt: “Create a table of at least 20 AI curriculum frameworks, AI literacy frameworks/models, or other sources to help me build an AI curriculum for my school. Make sure to include research-based frameworks and models. Include the name of the resource (column 1), a brief description of it (column 2), a description of why I should use it as a model or resource for my school’s AI curriculum (column 3), and a link to external sites to learn more information (column 4).”
Ask students to select at least 5 resources from their curation to critically examine and annotate, using the following AI-generated questions to guide their thinking:
What is the stated purpose or goal of this framework or resource?
Who created it, and what expertise or perspective do they bring (e.g., educators, technologists, policymakers, researchers)?
Missing Perspectives: Whose voices are missing from the authorship or the examples used? (e.g., Global South perspectives, Indigenous data sovereignty, non-corporate viewpoints).
What definitions of “artificial intelligence” or “AI literacy” does it rely on? How does this shape the rest of the resource?
What big ideas, concepts, or competencies does this resource emphasize that you think should appear in your school’s AI curriculum? Why?
What specific AI definitions, skills, or knowledge domains does this resource identify as essential? Which of these are non-negotiable for your specific student body?
Who is left out by this framework? Does it require expensive hardware, high-speed internet, or prior coding knowledge that your students may not possess?
How does the resource address ethical, societal, or environmental implications of AI? What elements of this should be included in your curriculum?
Does the resource treat AI as a standalone Computer Science subject, or does it offer strategies for integrating AI literacy into multiple subjects and classes?
What does this resource do exceptionally well? How does it contribute to an informed, balanced, or future-ready AI curriculum?
What is missing from the resource that is important for your school’s context (e.g., student diversity, local community needs, digital divide, civic engagement)?
How well does this resource align with your district’s mission, values, or current technology curriculum?
What adaptations would you make to this resource to ensure your curriculum is inclusive, engaging, and accessible to all learners, including multilingual learners and students with disabilities?
How does this resource compare to the other frameworks you selected? Where do they overlap or diverge?
Then, ask students to work in groups and design their own AI curriculum for their class, school, or district.
ACTIVITY 2: Collaboratively Design an AI Curriculum with GenAI and School/Community
Ask students to use a collaborative technology to get feedback on their AI curriculum from family members, community members, and educational leaders.
They might do this by sharing their AI curriculum in a Google Doc with commenting features on and asking others to add their thoughts/ideas/suggestions/questions as comments throughout the document; or they could share a link to their AI curriculum document and provide a virtual space like Padlet or IdeaBoardz to collect feedback and ideas.
Then, have students, in their teams, review the feedback they received and make revisions to their AI curriculum.
Ask students to present their AI curriculum to the entire class and get feedback from their peers.
Then, as a class, vote on one curriculum (or multiple curriculums that can be merged into one) to send to the school leadership as an official proposal.
REFLECTION QUESTIONS
What role do you want AI to play in your schooling? Why?
Do you want AI to be taught as a standalone topic/class? Why or why not?
What learning opportunities do you need in school to confidently navigate the Age of AI?
AI LITERACY QUESTIONS
What are the arguments in favor of or against establishing an AI literacy or AI education graduation requirement for students at your school or in your state?
What AI ethical issues did you include in your curriculum? Why did you include those issues?
ISTE Global Collaborator Criteria Addressed:
1.7.b Multiple Viewpoints. Students use collaborative technologies to work with others, including peers, experts or community members, to examine issues and problems from multiple viewpoints.
1.7.c Project Teams. Students contribute constructively to project teams, assuming various roles and responsibilities to work effectively toward a common goal.
1.7.d Local and Global Issues. Students explore local and global issues, and use collaborative technologies to work with others to investigate solutions.
Torrey Trust, Ph.D., is a Professor of Learning Technology in the College of Education at the University of Massachusetts Amherst. Her work centers on empowering educators and students to critically explore emerging technologies and make thoughtful, informed choices about their role in teaching and learning. Dr. Trust has received the University of Massachusetts Amherst Distinguished Teaching Award (2023), the College of Education Outstanding Teaching Award (2020), and the International Society for Technology in Education Making IT Happen Award (2018), which “honors outstanding educators and leaders who demonstrate extraordinary commitment, leadership, courage, and persistence in improving digital learning opportunities for students.” More recently, Dr. Trust has been a leading voice in exploring GenAI technologies in education and has been featured by several media outlets in articles and podcasts, including Educational Leadership, U.S. News & World Report, WIRED, Tech & Learning, The HILL, and EducationWeek. www.torreytrust.com
Robert W. Maloy is a senior lecturer in the College of Education at the University of Massachusetts Amherst, where he coordinates the history teacher education program and co-directs the TEAMS Tutoring Project, a community engagement/service learning initiative through which university students provide academic tutoring to culturally and linguistically diverse students in public schools throughout the Connecticut River Valley region of western Massachusetts. His research focuses on technology and educational change, teacher education, democratic teaching, and student learning. He is co-author of AI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students; Transforming Learning with New Technologies (4th edition); Kids Have All the Write Stuff: Revised and Updated for a Digital Age; Wiki Works: Teaching Web Research and Digital Literacy in History and Humanities Classrooms; We, the Students and Teachers: Teaching Democratically in the History and Social Studies Classroom; Ways of Writing with Young Kids: Teaching Creativity and Conventions Unconventionally; Kids Have All the Write Stuff: Inspiring Your Child to Put Pencil to Paper; The Essential Career Guide to Becoming a Middle and High School Teacher; Schools for an Information Age; and Partnerships for Improving Schools.
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. By EdTech Digest, she was named the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including "What The Tech? An Educator's Guide to AI, AR/VR, the Metaverse and More" and "How To Teach AI." Other titles include "In Other Words: Quotes That Push Our Thinking," "Unconventional Ways to Thrive in EDU," "The Future is Now: Looking Back to Move Ahead," "Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow's World," "True Story: Lessons That One Kid Taught Us," and "Things I Wish […] Knew." Her newest, "How To Teach AI," is available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
Interested in writing a guest blog for my site? Would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
Also, check out my THRIVEinEDU Podcast here!
Join my show on THRIVEinEDU on Facebook. Join the group here.
Welcome to "Students, Teachers, and Chatbots: Learning Plans for Exploring Civic Issues with GenAI"! In this monthly series, you will find classroom-ready learning plans to use as you explore different civic engagement issues and topics with students. Each learning plan is connected to one of the ISTE (International Society for Technology in Education) Standards for Students.
Imagine you have to vote in a school, local organization, community, state, or national election about a much debated and highly controversial issue. Someone proposes that instead of engaging in lengthy and potentially bitter debates, the group just let AI decide for them. What would be your response?
The question is no longer hypothetical. Groups and government organizations in other countries are already turning decisions about policies over to AI chatbots.
Will chatbots make better decisions than elected political leaders or citizen voters? Many people now believe so. In a survey spanning more than 35 countries and seven languages, respondents were 30 percent more likely to believe chatbots would act in their best interest and make better policy decisions on their behalf (Tech and Social Cohesion, 2025).
Letting chatbots make public policy decisions is known as “Algocracy” or “government by algorithm” (Thompson, 2022). The appeal of this idea is not hard to understand. People in country after country express distrust of politicians and political systems while also believing in the objectivity and efficiency of computer programs. Since chatbots are already proving they can make medical decisions at rates that can exceed those of human doctors, why wouldn’t chatbots do a better job of deciding where to spend money and allocate scarce resources?
Critics of algocracy are quick to point out that chatbots are not neutral tools. They function based on the datasets on which they have been trained, and that information has been shown to have alarmingly large amounts of misinformation and deep cultural, gender, racial, ability, and language biases (learn more).
Moreover, chatbots are "black boxes," meaning users do not know how the systems actually make decisions. While how chatbots make decisions is invisible, the actions of elected representatives are matters of public record. Online and in print, you can research how your senator, representative, town or city council member, mayor, or other elected officials voted on the issues, and you can write to them to express your views for or against their actions.
So what role, if any, should AI play in making decisions in democratic settings? Two former Google executives have proposed “rather than replace democracy with A.I., we must instead use A.I. to reinvigorate democracy, making it more responsive, more deliberative and more worthy of public trust” (Schmidt & Sorota, 2025, para. 3). This activity explores ways that AI can promote democracy and democratic decision-making while strengthening people’s participation in government and society.
Learning Goal
Students will build their civic knowledge by exploring the real-world issue of algocracy.
ACTIVITY 1: Using GenAI to Make Decisions for a Day (or an Hour)
Pick one day, one class, or one hour, and let GenAI make all the decisions for the class about what to do.
Example Prompt: “Respond yes or no and explain your reasoning for the following question from my 7th-grade students: Should we read Hamlet today or play Roblox?”
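For classes that want to repeat this activity with different choices each day, the example prompt above can be treated as a fill-in-the-blank template. Here is a minimal sketch in Python; the function name and structure are my own illustration, not part of any particular tool, and the output text is just the prompt itself, to be pasted into whichever chatbot your school has approved.

```python
# Build the "let GenAI decide" classroom prompt from the day's two options.
# This only constructs the text; it does not call any AI service.

def build_decision_prompt(grade, option_a, option_b):
    """Return the yes/no decision prompt for two classroom options."""
    return (
        f"Respond yes or no and explain your reasoning for the following "
        f"question from my {grade}-grade students: "
        f"Should we {option_a} or {option_b}?"
    )

prompt = build_decision_prompt("7th", "read Hamlet today", "play Roblox")
print(prompt)
```

Swapping in new options each day keeps the wording consistent, which also makes it easier for students to compare how the AI's answers change with the choices offered.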
At the end of the day, class, or hour, invite students to reflect on their initial response to the student engagement question (“If a decision needs to be made, would you rather vote on it or have an AI chatbot decide?”) and whether they would change their response based on their experience asking GenAI to make decisions for them.
Then, have students research the concept of algocracy and current examples of AI decision-making by elected officials.
Finally, invite students to write a letter to their local town or state government in favor of, or in opposition to, this concept.
ACTIVITY 2: Critical Analysis of AI Decision-Making in Government
Invite students to research and then discuss the following questions:
How could the biases embedded in data shape political decision-making from AI systems?
How might AI-generated hallucinations affect governmental decision-making?
Who might benefit from AI decision-making in government or an algocracy?
Who might be harmed by AI decision-making in government or an algocracy?
How might AI decision-making shift power dynamics within government? Who gains new forms of authority, and who loses it?
If an AI system makes an unjust or harmful decision, who should be held accountable (e.g., the AI system's developer, or government officials)?
Who is more trustworthy? A politician or an AI system? Why?
Then, based on their research and discussion, invite students to respond to the reflection questions that follow.
Reflection Questions
What role do you think AI systems will play in governmental decision-making 30 years from now? What about 100 years from now?
How might AI-driven governance shape or reshape democracy?
Would you vote for an AI candidate over a human candidate? Why or why not?
Could heavy reliance on AI governance discourage civic engagement or participation? Why or why not?
AI Literacy Questions
If you were to build an AI system to make decisions for the government, what data would you use to train the system? How would you reduce hallucinations? What safeguards would you put in place? What other ethical considerations would guide your design?
If GenAI systems can process far more information than humans, does that make them better decision-makers? Why or why not?
ISTE Knowledge Constructor Criteria Addressed
1.3.a Effective Research Strategies. Students use effective research strategies to find resources that support their learning needs, personal interests, and creative pursuits.
1.3.b Evaluate Information. Students evaluate the accuracy, validity, bias, origin, and relevance of digital content.
1.3.d Explore Real-World Issues. Students build knowledge by exploring real-world issues and gain experience in applying their learning in authentic settings.
Additional resource: an assessment of Switzerland's efforts to build an ethical large language model for the public good, trained only on publicly available content.
Many conversations have focused on artificial intelligence, especially over the past three years since the launch of ChatGPT. AI-powered tools have led to many new technologies and advancements in education and work. And now, something else is becoming part of the conversation. Have you heard about "agentic AI"? When I have spoken about it, the response has been that it sounds abstract or highly technical, and for some, it even sounds scary. It has become another buzzword to add to the AI-related vocabulary. But agentic AI represents a real shift in what AI can do and, for educators specifically, in how it can support teaching and learning in ways that go beyond chatbots and text, audio, and image generation.
Whether you teach kindergarten or high school, and whether you feel confident with AI or are just starting to explore it, agentic AI is something you'll want to understand. Not just because it's an evolving area, but because it is beginning to reshape how educators think about their workflow, student agency, and classroom productivity.
So what is it? Why does it matter? And how can we use it meaningfully in our practice?
What Is Agentic AI?
Agentic AI is different from the tools we have become used to and probably use frequently. Most AI tools, such as ChatGPT, Gemini, or Claude, fall into the category of generative AI. You provide a prompt, and these LLMs or other tools produce a response. They can draft, summarize, translate, and brainstorm, but they only work step by step based on your input.
How Agentic AI Is Different
Agentic AI refers to systems that can take on multi-step tasks, make autonomous decisions within given parameters, and carry out complex workflows with minimal human input. Rather than telling AI what to write, you tell an agent what you want to accomplish, and it decides on and then takes the steps needed to get there.
I think of it like moving from having a powerful assistant to a collaborator who takes the initiative and digs into the research and the work.
Examples include AI that can:
Analyze student work, identify patterns, and suggest grouping strategies
Build a multi-week lesson that includes relevant standards, suggested pacing constraints, classroom goals, and more
Draft emails, create slides, and prepare communication resources like newsletters or infographics
Review data, generate insights, and highlight actionable next steps
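To make the "multi-step tasks with minimal input" idea concrete, here is a toy sketch of the plan-and-execute loop at the heart of agentic systems. Everything in it (the goal name, the hard-coded step list, the pretend execute function) is invented for illustration; a real agent would ask a model to produce the plan and would call actual tools or APIs at each step.

```python
# Toy "agentic" loop: given one high-level goal, the agent plans its own
# steps and works through them without further input from the user.

def plan(goal):
    """Break a goal into ordered steps (hard-coded for this demo; a real
    agent would have an LLM generate the plan)."""
    plans = {
        "prepare unit": [
            "gather relevant standards",
            "draft a pacing guide",
            "generate differentiated materials",
        ],
    }
    return plans.get(goal, [])

def execute(step):
    """Pretend to carry out one step; a real agent would call tools or
    APIs here (search, documents, slides) and observe the result."""
    return f"done: {step}"

def run_agent(goal):
    """The loop: plan once, then act step by step, recording results."""
    return [execute(step) for step in plan(goal)]

for result in run_agent("prepare unit"):
    print(result)
```

The key difference from a chatbot is visible in the shape of the code: the user supplies only the goal, and the sequencing of work happens inside the loop rather than through one prompt per step.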
What Agentic AI Means for Education
In my experience, the use of agentic AI has been about testing its capabilities, saving time, and becoming more efficient. These benefits matter for several reasons, but one is critical: the time saved can then be used to work with our students and colleagues, and to connect as only humans can.
Here are three ways that agentic AI can assist educators in our work:
1. Automating the Work That Reduces Our Time With Students
Teachers spend enormous amounts of time on administrative tasks, and agentic AI can reduce this load. An agent can help with scheduling, lesson ideas, generating resources for class instruction, and more.
2. Supporting Differentiation and Personalization
Differentiation is important, and it can take time to find the right ideas for every student. Agentic AI can analyze learning objectives, reading levels, standards, and classroom needs, and then generate supports such as modified reading passages, tiered problem sets, alternative explanations for complex ideas, sentence stems or vocabulary scaffolds, and suggested enrichment activities.
Rather than creating multiple versions of an assignment or assessment, teachers can leverage the agent to design or suggest differentiated materials and then use the time saved to support students more meaningfully.
3. Improving Digital Wellness Through Better Workflow
Digital wellness and balancing the use of tech are also common topics of discussion, especially with so much tech available. Agentic AI can support digital wellness when used purposefully. Instead of having students spend more time navigating apps, notifications, or endless digital distractions, an agent can streamline tasks and reduce digital overwhelm. Ask the agent to organize resources or create a structured plan based on a few ideas, then use the suggestions to build out a plan on your own.
Agentic AI Is Not…
Knowing what Agentic AI is and how it works is important. However, it is also important to understand what it is not.
Agentic AI is not:
A replacement for teachers
A grading automation system that removes human judgment
A tool that should work without guardrails
Something to hand to students without teaching digital citizenship and AI literacy
Instead, agentic AI should be a partner that is only used in combination with human oversight, reflection, and ethical boundaries.
This is where we, as educators, play an essential role.
How to Try Agentic AI Today
Start with Your Workflow
Try an agent-based tool to:
Organize weekly lessons
Generate draft template emails (never include any personally identifiable information, or PII)
Build slide decks or provide bullet points for slides
Review data (remove PII) and summarize trends
I always suggest starting small. Think about one challenge or a “pain point” and then explore how an agent helps.
Use Agents for Planning and Support
Ask an AI agent to:
Create a standards-aligned sequence for a unit
Design project-based learning ideas
Suggest or generate differentiated materials
Identify vocabulary that students may struggle with
Always review carefully. Revise and personalize the outputs through your own experiences and specific needs.
Agentic AI is another change we need to adjust to; we may not fully embrace it, but we should at least explore and understand what it is, how it works, and its potential benefits and concerns. As with all technology, we have to keep the focus on human-centered teaching, purposeful and intentional implementation, and clear boundaries.
If you have not yet tried agentic AI, take a few moments to see what it can do. I’d love to hear how it goes!
Computer Science Education Week is recognized in December each year, with the timing selected to coincide with the birthday of Grace Hopper, a pioneer in computing. Every year during Computer Science Education Week, classrooms around the world plan activities to participate in the Hour of Code, to inspire everyone to explore the possibilities and opportunities available through coding. But this year, the plans may be a little different. There has been a shift toward focusing on the Hour of AI.
Over the past three years, AI has continued to advance and bring more tools into our classrooms and the world. There are so many possibilities when it comes to AI and coding, and the technology has continued to improve. Now, through a collaboration between Imagi Labs and Lovable, educators and students can dive into coding without writing a single line of code. It sounds impossible, but it is true. Educators and students create code simply by describing what they want. This is vibe coding. And the best part is that you don't need a background in coding to get started! My recent experience with Lovable and Imagi has shown how easy it is to build an app, create a game, and more, simply by using natural language prompts. (Sign up to learn more during the Tuesday, December 9th webinar here.)
And when it comes to AI, there have been valid concerns around data privacy. With Imagi and Lovable, it is easy to get started without sharing student data or involving a time-consuming, complex setup. Vibe coding and the available resources help promote computer science and AI literacy in all classrooms and focus on healthy and intentional use of AI.
So What Is Vibe Coding?
Vibe coding is a way to dive into coding without writing lines of code. Rather than typing out syntax, you simply use words to describe the vibe of the program you want to create, and then AI helps build it. Think about what happens with prompting. With vibe coding, you use natural language prompts to describe the kind of game or app you want to create, and AI takes care of generating the code. In my more recent experiences, I've explored Imagi and Lovable; Lovable is an AI-powered platform that lets anyone (with or without coding experience) create websites, apps, and games simply by describing them. The focus shifts to the wording, and then the ideas turn into a working project. You spend time considering the concept, refining the descriptions, and iterating throughout the process.
I have used Imagi Labs for over a year, and now, with the new learning experience via vibe coding, I have more ways to focus on computer science and AI literacy. Imagi has partnered with Lovable to make vibe coding more classroom-friendly and easier to get started with. Through Imagi, educators have access to ready-made curriculum and a special school-safe mode for Lovable that does not require personal student accounts. Now all students can safely join in an Hour of AI activity and experience AI-driven coding, which educators can facilitate with more comfort and confidence.
Why Hour of AI and Vibe Coding?
The Hour of AI is an evolution of the Hour of Code, which I have participated in with my students for years. Initially, I thought of it as just an hour, but it is really meant to be an hour that inspires you to continue bringing coding and computer science opportunities into all classrooms. There is a growing need to build foundational AI literacy skills in addition to computer science skills in order to prepare students for the future. Through these resources, whether Hour of Code or Hour of AI, the goal is to show students that anyone can explore AI and coding.
Vibe coding is the perfect activity to explore because it makes getting started even easier. I think about it like this: if you and your students can write a sentence or explain a concept, then you can start creating with code. Vibe coding does not require prior coding experience, and through Imagi and Lovable, there are tutorials that show anyone can learn to code in a fun, AI-powered way. It is often described as plug-and-play, and I think it is another great opportunity for the Hour of Code/AI season this year! And, to learn how to use it, join us for a great conversation and demo!
What I have always enjoyed during Hour of Code and Computer Science Education Week activities is the students' reactions! Whether they build a game or just learn more about coding and become excited about the possibilities, it is always a great learning opportunity for them, and for me too.
With opportunities to build and customize their own video game, it draws them right in. The specific project they’ll create is totally up to them, which sparks creativity and builds confidence and excitement in learning. What makes it even better is how students build it. Simply by typing their ideas in plain text, through a prompt, they end up with code that is quickly generated. For example, a student might start with a prompt like, “Create a game where a cat catches falling treats and earns points.” Lovable’s AI will take their prompt and generate an initial game which may have a cat sprite at the bottom of the screen that you can move, and treats dropping from the top. Students then test the game to see how it works and collaborate to improve it.
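To demystify what "the AI writes the code" means, here is a heavily simplified, text-only sketch of the kind of rule a cat-catches-treats game runs on. This is a hypothetical illustration, not Lovable's actual output; a real generated game would add graphics and input handling around the same core logic.

```python
# Text-only sketch of the catch-the-treats rule: each turn a treat falls
# in a random column, and the cat earns a point when it is in that column.

import random

def play_round(cat_column, treat_column, score):
    """Core game rule: same column means the treat is caught (+1 point)."""
    return score + 1 if cat_column == treat_column else score

def play_game(turns=10, columns=5, seed=0):
    """Simulate a short game where the cat moves under every treat."""
    rng = random.Random(seed)
    score = 0
    for _ in range(turns):
        treat_column = rng.randrange(columns)
        cat_column = treat_column  # a perfect player catches everything
        score = play_round(cat_column, treat_column, score)
    return score

print(play_game())  # a perfect player scores one point per turn: 10
```

Seeing the rule laid bare like this can help students connect their plain-language prompt ("a cat catches falling treats and earns points") to the if-then logic the AI produces on their behalf.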
From there, the creative iteration kicks in. Maybe one student wants the game to be about space instead of treats. They just need to ask the AI to switch the theme by typing in, "Change it to a space game catching asteroids instead of treats." Starting with games where students catch items is a great way to begin, and because students' games can be adapted to be relevant to any subject or story, the activity engages their personal interests and connects meaningfully with classroom content. The AI takes care of the coding, but students remain the designers, guiding the outcome with their descriptions. And this is how we move them from consumers to creators and innovators!
This process also introduces the concepts of prompt refining and debugging in a very digestible way, especially if there are limits on the number of prompts students can use; it requires them to really think things through and be specific. If the generated game doesn't run exactly right on the first try, students learn to tweak their description by adding more details. They may ask the AI to move an item faster or change a color to a lighter shade. Students debug by having a conversation with the AI, which helps them problem-solve too. They learn to write prompts and debug creatively while building their game, which results in less frustration and instead sparks curiosity. Students can consider: What happens if I ask the AI to do this? How can I change the appearance of the characters or the background?
Students can publish or share their game, which they always enjoy! For some students, this may be the first time they've coded something playable, which is a huge confidence boost and, hopefully, the moment they realize that coding (and AI) can be creative, fun, and, most importantly, something that everyone can do. Another benefit is the collaboration that happens. Want to join us and learn together? Sign up here for our livestream happening Tuesday!
Building AI Literacy and CS Skills
Beyond the excitement of making a game, vibe coding activities provide real instructional value. They align with traditional computer science foundations and emerging AI literacy standards. The available lessons have been mapped to AI literacy competencies from the AILit framework, including skills to Evaluate, Create, and Design with AI.
Evaluate: Students practice critical thinking by examining what the AI produces and deciding if it’s acceptable or needs some tweaking. For example, if the AI’s first attempt has a bug or the theme is slightly off, students must decide whether to accept the result, refine their prompt, or start again. Students learn to question the AI output rather than trust it immediately, which is a key AI literacy skill they need to develop.
Create: Rather than simply playing and consuming a game, students collaborate with generative AI to create one. They continue to refine the results and reflect on how their prompts (their thought processes) lead to different outcomes. It's an easy way to introduce how human creativity and AI can work together, rather than having AI replace their thinking. Students see that AI can assist their creativity, but that their own ideas and adjustments are what actually drive the project.
Design: By the end, students are able to describe how an AI system like Lovable helped them build a solution to a problem or project idea. They realize that they have designed a simple software product by leveraging AI, and they see how AI tools might help solve problems in any field. I think this is a great way to engage students in a discussion in any subject or to focus on community issues, with an emphasis on designing with AI for real-world contexts.
Using these tools, students learn classic computer science concepts in an age-appropriate way. They understand algorithmic logic (the game has rules like "if the cat catches treats, the score increases"), and they practice testing and debugging (when their game doesn't work as expected, they try again and iterate). The difference is that the AI handles the syntax and heavy coding, which allows students to focus on logic and game design. It is truly empowering for younger learners and for any learner who may hesitate to try traditional coding. Now they can learn to code in a way that reduces the frustration of cryptic coding errors.
Teacher Support
Trying a new tech tool in class can be time-consuming, but Imagi + Lovable make it easy to dive in. There are a variety of teacher supports available to help teachers feel prepared and confident, even if it’s the first time exploring AI and coding in the classroom. A few of the features:
Detailed Lesson Plan: A step-by-step lesson guide is provided, outlining the learning objectives, timing for each part of the activity, discussion questions, and potential student responses. It’s basically a script you can follow or adapt.
Slide Deck: There are ready-to-use slides designed for projecting in class while you run the Hour of AI. They introduce key concepts (like “What is AI?” and “What is vibe coding?”), show visual examples, and include prompt examples to guide students. There are also speaker notes.
Account Setup Is Simple: Imagi handles creating student accounts for Lovable with one click. The focus is on privacy-first (accounts are anonymous and expire after the event).
Troubleshooting Help: Technology is great until it isn’t. But for this, don’t worry because the Hour of AI pack includes a troubleshooting guide for common issues.
There are more supports available! Sign up here for our livestream happening Tuesday!
By participating in this event and exploring vibe coding during the Hour of Code/AI, we are helping students build foundational AI literacy in an engaging way.
If you’ve been thinking about coding and AI, then Computer Science Education Week and the Hour of AI are the perfect time to dive in. Set aside an hour for vibe coding and see the impact when students see their ideas come to life.
Ready to get started? Join the webinar or sign up to get the recording and resources!
Let’s work on fostering creativity and building AI literacy for every student…one vibe at a time!
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. EdTech Digest named her EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to Follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More,” “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” “Things I Wish […] Knew,” and her newest, “How To Teach AI,” available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, Cybersecurity, STEM, AR/VR, and more for your school or speaking event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, Threads, and X at @Rdene915
Interested in writing a guest blog for my site? I would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
Also, check out my THRIVEinEDU Podcast here!
Join my THRIVEinEDU show community on Facebook. You can join the group here.