AI Literacy is Not Tool Mastery: How to Build Sustained Educator Capacity

Previous post on Getting Smart

Not long ago, artificial intelligence in education felt novel. It was shiny, experimental, and, for many educators, unsettling. When ChatGPT arrived in November 2022, the initial conversations centered on fear. I recall receiving emails, text messages, phone calls, and visits from educators who were concerned about cheating, plagiarism, lost skills, and what instantly felt like an overwhelming pace of change. It was one more adjustment, coming not long after the upheaval so many felt in March of 2020.

But since that initial adjustment at the end of 2022 and through 2023, I’ve seen a shift. At first there was skepticism, uncertainty, and hesitation, and not just in the world of education. As we’ve continued to adjust to new tools and new ways of working, the conversation has moved from treating AI as a “what if” to accepting that AI is here and its use is increasing. It’s embedded in tools educators already use, and if it hasn’t already, it will slowly but surely become part of the daily routine and workflow of teaching and learning.

I’ve spoken about this shift from novelty to normalcy and how it brings a new challenge: educator upskilling.

A few years ago, I started researching the AI training available to educators and other professionals. At the end of 2023, 87% of educators in the United States had not received any AI training. In my workshops, some attendees are still having their first training experience, more than three years after ChatGPT made its debut. So the question is no longer whether educators need professional learning around AI; most people agree that they do. The bigger issue is whether we are approaching AI professional development in ways that are deep, sustained, and human-centered, or whether we’re still offering one-and-done sessions that barely scratch the surface. Given the pace of change in education and the world, we need to do better and be prepared.

Shifting to Ongoing Capacity Building

When I completed my doctorate nearly two years ago, my research focused heavily on professional learning in emerging technologies, with a strong emphasis on AI. Even then, the message was clear. A single PD session, or even a series of short, tool-based trainings, was not enough, especially if completed early in the year or during a limited time span.

Yet that is often how AI PD is still structured today. Through surveys in my sessions and conversations with other educators, a common experience keeps surfacing:

  • A 30-minute overview.
  • A 15-minute “certified educator” badge.
  • A walkthrough of one tool done well.

While these experiences can be helpful, especially for getting started and when time is limited, in the long term, they don’t build AI literacy. They build familiarity, whether with AI concepts or an AI tool. But familiarity is not AI literacy. Not for us as educators, nor for the students we are preparing for a future surrounded by AI and a world of work that seeks employees skilled in AI. 

Continue reading the original post on Getting Smart.

About Rachelle

Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!

Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. By EdTech Digest, she was named the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.

She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More” and “How To Teach AI.” Other titles include “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” and “Things I Wish […] Knew.” Her newest, “How To Teach AI,” is available from ISTE or on Amazon.

Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.

Follow Rachelle on Bluesky, Instagram, and X at @Rdene915

Interested in writing a guest blog for my site? I would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks

Also, check out my THRIVEinEDU Podcast here!

Join my show on THRIVEinEDU on Facebook. Join the group here.

A Closer Look at What’s New in Kira 2.0

In collaboration with Kira

During our THRIVEinEDU livestream conversation about Kira, we explored a question that immediately resonated with educators:

What if planning, grading, and differentiation actually took half the time and still kept teachers in control of learning?

The question isn’t just about efficiency. It’s about sustainability and about supporting teachers to make instruction more responsive, more personalized, and more aligned to what students actually need in the moment: real-time responses, authentic feedback, and support from their teachers.

Kira recently released several new features, as part of its Kira 2.0 launch, that move beyond treating AI as a “lesson generator” or “assessment creator” and position it as a thought partner in the instructional workflow. After attending the live launch in New York on March 3rd and moderating the livestream, I came away with several takeaways that make the newest updates especially impactful for classrooms now.

Lesson/Course Studio

Many AI tools help teachers create one lesson at a time, which is helpful and time-saving. But imagine being tasked with creating a course you’ve never taught or don’t have enough resources for. The amount of time required can quickly become overwhelming.

Kira’s Course and Lesson Studio helps educators generate both structured lessons and full, standards-aligned courses, including course outlines, unit sequences, lesson progressions, and assessments.

Educators provide the topic, subject, grade level, and standards, and from that prompt, Kira builds the lesson with embedded formative checks already in place.

Formative assessment often happens after instruction; with Kira, teachers see student understanding during instruction.

As Rachel shared during the livestream:

“I don’t remember a time when I wasn’t taking work home or trying to get ahead of the game by planning out my week and then having to rewrite it midweek. It was so much work.”

Kira’s curriculum-building features help break that cycle. Rather than rewriting lessons to meet student needs, teachers start with a flexible structure they can adapt immediately and, most importantly, stay in control. We do the editing, adjusting, and shaping of the lesson. This distinction matters because it shows how crucial it is that teachers remain involved and review what has been generated.

Real-Time Insight Instead of End-of-Unit Surprises: Student Atlas

I have known about this for a few months and thought it was amazing. One of the most exciting updates in Kira 2.0 is Student Atlas, the platform’s student insight dashboard, now paired with Class Atlas, which brings those insights together at the class level.

Student Atlas provides:

  • concept-level mastery tracking
  • data confidence indicators
  • individual student support indicators
  • zones of proximal development insights
  • intervention suggestions

Rather than relying on a single quiz or test score, teachers can see which concepts students understand and where they’re struggling in real time. It enables us to see what concepts need reinforcing now, rather than waiting until the assessment is over and graded.

Class Atlas builds on this by turning individual insights into a clear, actionable class-wide view. Instead of opening 20+ student profiles and piecing things together, teachers can instantly answer: Where should I focus my instruction? and Which students need help with this skill? Teachers can even ask Kira to explain how it generated its recommendations, which builds the kind of trust schools look for as they evaluate AI technologies.

Student Atlas also includes a data confidence indicator, helping educators assess the reliability of recommendations before making instructional decisions. That transparency supports professional judgment instead of replacing it.

Standards Alignment

Standards alignment is often one of the most time-consuming parts of planning, especially when building units or courses, and even more so for educators teaching multiple courses. With Kira 2.0, that burden decreases: Kira automatically tags lessons, activities, assessments, and questions to state standards, underlying skill progressions, and Bloom’s taxonomy levels.

Teachers can track how students are progressing through skills over time.

Supporting Multilingual Learners

Another standout feature we spoke about in the livestream is Kira’s built-in support for multilingual learners.

When gaps in understanding appear, Kira can generate:

  • scaffolded practice
  • targeted follow-up lessons
  • leveled reading supports
  • vocabulary scaffolds
  • translated instructional materials

Each of these supports is based on individual student performance, and not on a generic template that does not align with the student’s needs.

Differentiation becomes responsive rather than reactive.

During the livestream, we talked about how, historically, differentiation required teachers to manually create multiple versions of lessons or assessments, which, of course, took a lot of time. With Kira, these supports are embedded directly inside the instructional workflow. Rachel said, “Especially talking about differentiation and the ease of it and being able to have the assistant nearby and go back and forth.”

Embedded support assists educators in providing what each student needs while giving them more time to work directly with each student.

Kira provides structure, but the teachers are the designers who provide the course’s vision.

Kira brings planning, assessment, differentiation, and student insight into one connected space. And when those pieces connect, teachers gain something incredibly valuable:

  • clarity
  • flexibility
  • time
  • better visibility into learning


Tool, Companion, or Supplemental Brain? What AI Will Be Depends on YOU!

Guest post by Robert W. Maloy and Torrey Trust

What are GenAI technologies, and what do we want them to become? Right now, GenAI is an educational chameleon, aggressively marketed as an indispensable learning companion, an academic partner, and a labor-saving tool; and at the same time, widely critiqued as a dangerous source of misinformation and biased responses, an environmental degrader, and a privacy invader. Since GenAI is all of these things and more, how do we use these tools appropriately and thoughtfully?

What GenAI is and what it will become depends on YOU – how you think about its roles, use it in your teaching and learning, and describe its functions to others.

Let’s look at two currently popular descriptions and uses of GenAI: 1) GenAI as a companion; 2) GenAI as a productivity-enhancing tool.

First, GenAI is widely described and used as a supportive “companion” or helpful “partner.” The Harvard Business Review (2025) reported that therapy/companionship was the number one way people were using GenAI in 2025. An alarming number of teens acknowledge that GenAI chatbots are their virtual companions, even though this technology can exploit youngsters’ emotional needs in ways that lead to self-harm and other risks (Common Sense Media report, Robb & Mann, 2025). One of the key problems here is that GenAI is NOT human, and it is not even intelligent (at least in the way humans perceive and describe intelligence).

The Key Takeaway: Using terms like “partner” or “companion” to describe GenAI technologies humanizes tools that are not designed to provide the support, guidance, and level of intelligence that actual humans can provide.

Second, GenAI technologies are widely presented as productivity-enhancing, time-saving, efficiency-increasing tools for people to use to improve their lives. “Use ChatGPT to make life easier,” declared a recent email advertisement, where all one had to do was “just tap a chat to start.” Personal and professional productivity is also one of the top ways people are using GenAI technologies – from writing emails and reports, to planning vacations and meals, to studying for exams; and it is certainly true that GenAI technologies can do all these things and so much more really fast. Yet, personal autonomy, creativity, and agency are lost when one uses GenAI technologies to automate activities they formerly did without them.

The Key Takeaway: Avoid talking about GenAI as automating work; instead, think directly about how it can augment or supplement your activities as a teacher and a learner.

So if not a human-like companion or a productivity-enhancing automation tool, how can we think about the role of GenAI in education? We believe that GenAI is best used when it augments teaching and learning, much the way a caddie in golf enhances the golf experience. As such, we offer the metaphor of GenAI as a caddie, while again reminding you that it is not an actual caddie and we are not trying to humanize this tool.

Professional golfers and their caddies on the LPGA, PGA, and more than 20 professional golf tours worldwide offer a metaphor for thinking about, describing, and using GenAI. Each pro golfer has a caddie who carries their clubs and walks alongside them during competitive tournaments, sharing ideas and information about the shots they are playing. For instance, until recently, LPGA player Brooke Henderson’s caddie was her older sister, Brittany; PGA player Xander Schauffele’s caddie is Austin Kaiser (his college golf teammate at San Diego State University).

Caddies have detailed information about the course and provide suggestions and feedback about what shots to hit with which clubs. They help keep track of the pace of play and how conditions of the course may be changing due to wind, weather, and time of day. However, it is the golfer who remains totally in charge of the outcomes of the game. Caddies do not hit the golf ball, and golfers do not always do what the caddie suggests. It is the golfer who must make decisions, hit the shots, and deal with consequences, both positive and negative, in terms of performance and score. Caddies are there to augment the golf experience and outcome.

When it comes to teaching and learning, GenAI can be that source of information, ideas, or inspiration like a caddie; and it is the teacher who must determine what to do with that information. They have the expertise; they understand their classroom dynamics and contexts; they know their students, their topic, their grade level, and their community.

The key is for the teacher to resist the temptation to automate their work by turning it entirely over to a GenAI technology, because in this case GenAI is in control of the shots, rather than the teacher. It is as if professional golfers let their caddies choose the club and then hit the ball for them. This is even more problematic when it comes to using GenAI to automate tasks. In our metaphor, the caddie is a human who has expertise and has played golf before; however, GenAI is not a teacher, has never taught, and has no idea what teaching is. Turning over any tasks to a tool that does not have any expertise in education can become really problematic. Teachers must maintain agency and exert control, deciding when to accept, when to reject, and when to modify whatever ideas and information the GenAI provides.

So, returning to our original statement, what GenAI is and what it will become depends on YOU – how you think about its roles, use it in your teaching and learning, and describe its functions to others. What do YOU want GenAI to be?

If you’re looking for ways to use GenAI to augment teaching and learning, check out the free online companion of our new book: GenAI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students published by ISTE (International Society for Technology in Education) or explore the bonus learning plans we’ve published on this blog: Learning Plans for Supporting Student Agency in the Age of AI & Learning Plans for Exploring Civic Issues with GenAI.

Nearly 50 years ago, at the outset of the computer revolution in schools, Seymour Papert asked: Will computers program the child, or will educators create the conditions where children program computers? For Papert then, as for us today in the age of GenAI, using technology remains a question of human control and user agency. GenAI can provide amazing resources, but it is essential that you retain your decision-making and personal creativity. Only then will the results be truly yours.

Torrey Trust, Ph.D., is a Professor of Learning Technology in the College of Education at the University of Massachusetts Amherst. Her work centers on empowering educators and students to critically explore emerging technologies and make thoughtful, informed choices about their role in teaching and learning. Dr. Trust has received the University of Massachusetts Amherst Distinguished Teaching Award (2023), the College of Education Outstanding Teaching Award (2020), and the International Society for Technology in Education Making IT Happen Award (2018), which “honors outstanding educators and leaders who demonstrate extraordinary commitment, leadership, courage, and persistence in improving digital learning opportunities for students.” More recently, Dr. Trust has been a leading voice in exploring GenAI technologies in education and has been featured by several media outlets in articles and podcasts, including Educational Leadership, U.S. News & World Report, WIRED, Tech & Learning, The HILL, and EducationWeek. www.torreytrust.com

Robert W. Maloy is a senior lecturer in the College of Education at the University of Massachusetts Amherst, where he coordinates the history teacher education program and co-directs the TEAMS Tutoring Project, a community engagement/service learning initiative through which university students provide academic tutoring to culturally and linguistically diverse students in public schools throughout the Connecticut River Valley region of western Massachusetts. His research focuses on technology and educational change, teacher education, democratic teaching, and student learning. He is co-author of AI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students, Transforming Learning with New Technologies (4th edition); Kids Have All the Write Stuff: Revised and Updated for a Digital Age; Wiki Works: Teaching Web Research and Digital Literacy in History and Humanities Classrooms; We, the Students and Teachers: Teaching Democratically in the History and Social Studies Classroom; Ways of Writing with Young Kids: Teaching Creativity and Conventions Unconventionally; Kids Have All the Write Stuff: Inspiring Your Child to Put Pencil to Paper; The Essential Career Guide to Becoming a Middle and High School Teacher; Schools for an Information Age; and Partnerships for Improving Schools.


From Awareness to Action: Responsible AI Adoption in Schools Now (Part 2)

In Part 1, I shared why understanding the legal landscape of artificial intelligence is essential as schools continue to explore how these tools can support teaching and learning. Schools everywhere are thinking through policies and how to best provide resources for educators, students, and families. Awareness of laws such as FERPA, COPPA, and GDPR, accessibility requirements, and concerns such as algorithmic bias and deepfakes set an important foundation for responsible implementation.

We need guidelines and guardrails. A common question I hear from educators and leaders after presenting sessions and workshops, or speaking at conferences, is: “What do we do next?”

Understanding the guardrails is only the first step. The real work begins when schools start building systems that support educators in applying this knowledge in practical, sustainable ways. And it requires true collaboration.

Responsible AI Adoption Is a Team Effort

One of the most important shifts happening right now is the recognition that AI adoption and policy development should not be the responsibility of a single person or a select few administrators or IT teams. Responsible implementation and policy development require collaboration across roles.

District leaders are shaping policy and expectations for the school community.

Technology teams are evaluating vendor compliance and infrastructure readiness. (I have a future post coming up about IT Teams and ongoing PD).

Instructional leaders are aligning tools with learning goals and supporting teachers with implementation.

Teachers are modeling and supporting ethical classroom use.

Students are exploring and developing AI literacy skills that will shape how they interact with technology throughout their lives.

I truly believe that when schools recognize AI as a shared responsibility rather than an isolated initiative, implementation becomes more intentional, reflective, and sustainable.

I consistently see this when working with districts across the country. The schools that are moving forward with confidence are not the ones adopting the most tools. They are the ones creating a community, developing a common language, and building shared understanding first.

Transparency Builds Confidence Across the Community

Another theme that has been coming up in conversations with educators and families is trust.

Families want and need to know:

What tools are being used?

What information is being collected?

How is student data protected?

How is AI, or any technology, being used in support of learning rather than replacing it?

Having clear answers to these questions helps to strengthen the essential partnerships between schools and families. It also creates opportunities for students to participate more actively in conversations about responsible technology use.

Transparency is not simply a compliance strategy. It is a relationship-building strategy. When schools communicate clearly and proactively, they reduce uncertainty and help communities better understand how innovation supports student success.

AI Literacy Is Now Part of Digital Citizenship

One of the biggest shifts happening in education right now is the expansion of digital citizenship to include AI literacy. We’ve been talking about media literacy, digital literacy, AI literacy, and even discernment. Our work is a bit more involved now, and we need to be prepared.

Students are already interacting with AI systems daily, both in and maybe more frequently outside of school. They need guidance, which means classrooms must play an essential role in helping students understand:

  • How to protect their personally identifiable information (PII)
  • How AI systems generate responses
  • How bias can appear in outputs
  • How misinformation spreads
  • How data is collected and used
  • How to evaluate whether a tool should be trusted

AI literacy is not about teaching students how to use a single platform. It is about helping them develop judgment.

When students learn how to ask better questions about technology, they become more confident learners and more thoughtful digital citizens. Emerging tools continue to shape how students research, communicate, and create, and as educators, we have to keep learning so we can guide them to use the tools available to them safely and successfully.

Accessibility and Equity at the Center

As schools explore AI tools, accessibility must be a part of every conversation.

AI has tremendous potential to support multilingual learners, provide personalized feedback, assist with reading and writing tasks, and help students access content in new ways, and it offers educators countless supports as well. But schools must continue evaluating whether tools meet accessibility expectations and support equitable learning experiences.

Responsible implementation means asking questions such as:

Does this tool improve students’ access?

Does it create barriers, particularly given renewed concerns about the digital divide?

Does it support multiple learning pathways?

Does it align with universal design principles, or with a Portrait of a Graduate or AI-ready graduate profile?

Technology should expand opportunity rather than narrow it.

Supporting Educators Through the Transition

One of the most encouraging things I have seen in my work with educators is their investment in learning and the desire to learn with and from their students.

Educators are exploring AI tools while also asking important questions about privacy, ethics, and instructional impact. This balance is exactly what responsible adoption should look like.

Professional learning plays an essential role.

Educators benefit from opportunities to:

  • Explore tools safely
  • Review privacy expectations
  • Understand policy implications
  • Design classroom strategies
  • Collaborate with colleagues
  • Develop shared language around responsible use

When professional learning includes both legal awareness and classroom application, educators feel more confident making decisions that support students. Confidence leads to stronger implementation. And this is the work I am most passionate about when working with schools.

Leadership Matters More Than Ever

School leaders are in a unique position to support responsible AI adoption by:

  • Developing clear expectations
  • Supporting cross-team collaboration
  • Communicating with families (consistently)
  • Reviewing vendor agreements carefully
  • Building a common language around the use of AI
  • Creating space for experimentation, with guardrails in place

Moving Forward

Artificial intelligence is already part of the learning landscape. The question is no longer whether schools should engage with AI, but how they will engage with it.

When schools combine legal awareness, transparency, accessibility considerations, and strong professional learning structures, they create innovative environments built on human decision-making.

Students benefit when educators feel confident.

Educators benefit when leaders provide clarity.

Communities benefit when schools communicate openly.

Responsible AI adoption is about moving forward with purpose.

When schools take that approach and have a team to work with, they are preparing students to understand technology, question it, and be the ones who determine what comes next.
