Reflecting on Education in 2025

As I reflect on 2025, it feels like a year of recalibration. When I think about education, things were definitely moving faster, especially with AI and the changes it has brought, yet I also feel the changes are happening at a deeper level. After several years of rapid change, disruption, and adjustment, many educators, leaders, and systems seem to have shifted from being reactive to being proactive and, more importantly, to focusing on reflective practices. Some questions I consider are:

What is actually working?
What is overwhelming students and teachers?
What does “future-ready” really mean, and is it the proper term?

In many ways, 2025 feels more like a time when education stopped trying to keep up with every new trend, took a breath, and began reclaiming its intention.

From Urgency to Intention

The past few years have required schools to operate in what I’ve heard described in many conversations as “crisis mode.” After some thought, I have seen and experienced a shift away from an overwhelming sense of urgency to accomplish everything and toward purposeful decision-making. A word that I have used a lot since reading Kevin Roose’s book Futureproof is “discernment.” He wrote about the shift from media and digital literacy to digital discernment, and I’ve seen this in my own practice as well. Educators have become more discerning about which initiatives to invest in, which tools to explore, and which expectations to set. The question “Can we do this?” shifted to “Should we do this?” and “Why?”, which then led to the “How.”

This shift showed up in conversations around curriculum, assessment, technology use, and student well-being. Schools began reducing or being more selective rather than layering on more, which helped educators adjust to change. Leaders focused on coherence instead of compliance. And in some conversations I had and articles I read, I noticed respectful pushback on practices that added complexity without improving learning.

I think this is why the recalibration mattered.

AI Moved From Novelty to Normal

Since artificial intelligence and all of the new tools arrived in classrooms, it has been an interesting time for educators. Educators I train have described AI as something novel, something cool yet scary at times, and, at other times, something to be avoided at all costs. But what I noticed this year has been a shift: away from worries about plagiarism and cheating, and about the time needed to learn how to leverage AI in our work, and toward a focus on how to bring it into our classrooms intentionally, purposefully, and responsibly. In 2025, AI in education has become more of the norm.

I have noticed a change in the reactions. Now I see more focus on:

  • Data privacy
  • Ethical use and attribution
  • Age-appropriate access
  • Skill-building over shortcut-taking (leaning on versus learning from)
  • Transparency instead of surveillance

AI has become less about “cheating” and more about helping students and others learn how to think, evaluate, and create responsibly in an AI-infused world. Educators I have worked with in my own school, at conferences, and during professional development sessions that I have provided have been asking different questions. At first, the questions focused on “How can I tell when a student has used ChatGPT?”, “Why do I need to teach about AI at the elementary level when they are too young and it is too much technology?”, and “How do I find the time to evaluate the tools?” Now, the questions are more targeted. Some examples are “How does this tool support learning goals?” and “When does it enhance or push thinking, and when does it replace it?” Questions also come up about how to connect AI to different grade levels and content areas without it feeling like something extra. I think the key to these questions is keeping the focus on the human aspect of learning and teaching.

We need to become AI literate and help students develop their AI literacy skills, which require more than technical skills. AI literacy also involves essentially human skills such as judgment, empathy, discernment, and reflection. With so much technology, the impact on us as humans is real, and it brings out the importance of digital wellbeing in addition to digital citizenship.

Digital Wellness

I’ve been working on an initiative through ISTE+ASCD and Pinterest that focuses on digital wellbeing and digital citizenship, both aligned with innovation. Something I’ve noticed in the conversations at the schools is that educators are realizing that digital citizenship alone is not enough. Conversations about constant connectivity and its costs have been taking place and are leading to new policies and guidelines in schools.

As a result, digital wellness has emerged as a priority for all, rather than as a standalone curriculum. In my work with educators, each group talked openly about:

  • Attention fatigue
  • Notification overload
  • Screen balance
  • Emotional regulation
  • Boundaries and agency

Cellphone bans were put in place, and while some saw the positives, others raised interesting points. Rather than banning technology outright or ignoring its impact, should we instead focus on intentional use and guide students? Questions like “When does technology add value?” and “When should we step away?” became part of the discussions in and out of the classroom.

Focusing on the Human Connection

In some schools that I visited, I noticed more socialization and more connections being made between students in the classrooms, along with more time for colleagues to work together and with their students.

There was renewed emphasis on:

  • Relationships over rigid pacing
  • Depth over coverage
  • Dialogue over compliance
  • Reflection over reaction

Administrators that I spoke with have said they are listening more closely and trust teachers to use their professional judgment. Something else I noticed was an increase in the inclusion of student voice in conversations about learning, technology, and school culture. I have asked students for feedback for many years and value their input as it expands my understanding and helps me to better connect. In some of the schools that I have visited, common questions to students have been:

  • How do you learn best?
  • What feels supportive vs. stressful?
  • How does technology affect your focus and well-being?
  • What do you want your teachers, families, and friends to understand about your experience?

When students were invited into these conversations, the results were powerful. They wanted agency, not avoidance. They wanted guidance on balance, which cannot be learned through complete bans. When students were treated as collaborators or partners in shaping their learning environments, it led to powerful learning and growth as a school community.

The Power of Reflection

I wrote about reflection, spoke about it, and engaged in it myself and with other educators. We often noted the growing need for reflection, especially in a field that is constantly changing.

Some areas that I considered:

  • What I kept doing out of habit
  • What I needed to let go of for sustainability
  • What truly mattered in my classroom
  • What I needed to do to make a difference
  • Whether I was involving my students in decisions
  • What kind of educator I wanted to be

Reflection shouldn’t be about perfection, at least not in my mind. I see it as a way to focus on continued growth, clarity, and purpose in my work.

Some things that I learned in 2025

I had many opportunities to learn and to share my learning with others. I provided keynotes and a lot of training, working with educators from around the world. When I tried to gather my thoughts about innovation, effective technology use, digital wellness, student voice and agency, and reflection, I came to some conclusions…at least as of today. I will continue to reflect.

  • Innovation without intention leads to exhaustion.
  • Technology must serve learning, not dominate it.
  • Wellness is foundational to continued growth.
  • Students are capable of thoughtful insight when involved in the conversation.
  • Reflection is a powerful driver of meaningful change.

Education does not become easier with each passing year, but I do find that the conversations become more transparent and honest.

As I close out my blogs for 2025, I leave you with some questions to consider, ones that I have considered myself:

Looking Back

  1. What was one moment in 2025 when teaching or learning felt especially meaningful for you? Why?
  2. What was draining or unsustainable this year compared to other years?
  3. What practice, tool, or expectation did you decide to let go of, and why?

Technology & Learning
  4. Where did technology genuinely support learning this year? How?
  5. Where did it lead to distraction, add extra pressure, or increase overload?

Well-being
  6. When did you feel most balanced as an educator this year? What contributed to your balance?
  7. What helped you to decide when something was “too much”?

You might even choose to engage in conversations with colleagues or a PLC for even more opportunities to learn and connect.

As we move forward into 2026, we must continue to design learning experiences that are human-centered, values-driven, and always reflective. If 2025 could offer advice, it might be this: slow down in order to choose well.

About Rachelle

Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!

Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. EdTech Digest named her the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to Follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.

She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More,” “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” and “Things I Wish […] Knew.” Her newest book, “How To Teach AI,” is available from ISTE or on Amazon.

Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.

Follow Rachelle on Bluesky, Instagram, and X at @Rdene915

Interested in writing a guest blog for my site? Would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks

Also, check out my THRIVEinEDU Podcast Here!

Join my show on THRIVEinEDU on Facebook. Join the group here.

Students, Teachers, and Chatbots: Learning Plans for Exploring Civic Issues with GenAI

Robert Maloy

Torrey Trust

Welcome to “Students, Teachers, and Chatbots: Learning Plans for Exploring Civic Issues with GenAI!” In this monthly series, you will find classroom-ready learning plans to use as you explore different civic engagement issues and topics with students. Each learning plan is connected to one of the ISTE (International Society for Technology in Education) Standards for Students.

You can find more of these learning plans in our free online companion for our new book, AI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students. We hope you will find these plans engaging, and we welcome your ideas and suggestions.

AI-Enhanced Learning Plan: Democracy vs Algocracy

Imagine you have to vote in a school, local organization, community, state, or national election about a much debated and highly controversial issue. Someone proposes that instead of engaging in lengthy and potentially bitter debates, the group just let AI decide for them. What would be your response?

The question is no longer hypothetical. Groups and government organizations in other countries are turning over decisions about policies to AI chatbots. There is even a term for this kind of AI decision-making: “algocracy,” or government by algorithm.

Will chatbots make better decisions than elected political leaders or citizen voters? Many people now believe so. In a survey spanning more than 35 countries and seven languages, respondents were 30 percent more likely to see chatbots, rather than elected leaders, as acting in their best interest and making better policy decisions on their behalf (Tech and Social Cohesion, 2025).

The appeal of algocracy (Thompson, 2022) is not hard to understand. People in country after country express distrust of politicians and political systems while also believing in the objectivity and efficiency of computer programs. Since chatbots are already showing they can make some medical decisions at rates of accuracy that can exceed those of human doctors, why wouldn’t they do a better job of deciding where to spend money and how to allocate scarce resources?

Critics of algocracy are quick to point out that chatbots are not neutral tools. They function based on the datasets on which they have been trained, and that information has been shown to have alarmingly large amounts of misinformation and deep cultural, gender, racial, ability, and language biases (learn more).

Moreover, chatbots are “black boxes,” meaning users do not know how the systems actually make decisions. While the ways chatbots make decisions are invisible, the actions of elected representatives are matters of public record. Online and in print, you can research how your senator, representative, town or city council member, mayor, or other elected officials voted on the issues, and you can write to them to express your views for or against their actions.

So what role, if any, should AI play in making decisions in democratic settings? Two former Google executives have proposed “rather than replace democracy with A.I., we must instead use A.I. to reinvigorate democracy, making it more responsive, more deliberative and more worthy of public trust” (Schmidt & Sorota, 2025, para. 3). This activity explores ways that AI can promote democracy and democratic decision-making while strengthening people’s participation in government and society.

Learning Goal

Students will build their civic knowledge by exploring the real-world issue of algocracy.

  • ACTIVITY 1: Using GenAI to Make Decisions for a Day (or an Hour)
    • Pick one day, one class, or one hour, and let GenAI make all the decisions for the class about what to do.
      • Example Prompt: “Respond yes or no and explain your reasoning for the following question from my 7th-grade students: Should we read Hamlet today or play Roblox?”
    • At the end of the day, class, or hour, invite students to reflect on their initial response to the student engagement question (“If a decision needs to be made, would you rather vote on it or have an AI chatbot decide?”) and whether they would change their response based on their experience asking GenAI to make decisions for them.
    • Then, have students research the concept of algocracy and current examples of AI decision-making by elected officials.
    • Finally, invite students to write a letter to their local town or state government in favor of, or in opposition to, this concept.
  • ACTIVITY 2: Critical Analysis of AI Decision-Making in Government
    • Invite students to research and then discuss the following questions:
      • How could the biases embedded in data shape political decision-making from AI systems?
      • How might AI-generated hallucinations affect governmental decision-making?
      • Who might benefit from AI decision making in government or an algocracy?
      • Who might be harmed from AI decision-making in government or an algocracy?
      • How might AI decision-making shift power dynamics within government? Who gains new forms of authority, and who loses it?
      • If an AI system makes an unjust or harmful decision, who should be held accountable (e.g., AI system developer? government officials?)
      • Who is more trustworthy? A politician or an AI system? Why?
    • Then, based on their research and discussion,

Reflection Questions

  • What role do you think AI systems will play in governmental decision-making 30 years from now? What about 100 years from now?
  • How might AI-driven governance shape or reshape democracy?
  • Would you vote for an AI candidate over a human candidate? Why or why not?
  • Could heavy reliance on AI governance discourage civic engagement or participation? Why or why not?

AI Literacy Questions

  • If you were to build an AI system to make decisions for the government, what data would you use to train the system? How would you reduce hallucinations? What safeguards would you put in place? What other ethical considerations would guide your design?
  • If GenAI systems can process far more information than humans, does that make them better decision-makers? Why or why not?

ISTE Knowledge Constructor Criteria Addressed

  • 1.3.a Effective Research Strategies. Students use effective research strategies to find resources that support their learning needs, personal interests, and creative pursuits.
  • 1.3.b Evaluate Information. Students evaluate the accuracy, validity, bias, origin, and relevance of digital content.
  • 1.3.d Explore Real-World Issues. Students build knowledge by exploring real-world issues and gain experience in applying their learning in authentic settings.

References

Citizens.IS. (2025). Better Reykjavik. https://www.citizens.is/portfolio_page/better_reykjavik/

National Council of State Legislatures. (2022, January 4). Initiative and Referendum Overview and Resources. https://www.ncsl.org/elections-and-campaigns/initiative-and-referendum-overview-and-resources

Nwanevu, O. (2025). The right of the people: Democracy and the case for a new American founding. Penguin Random House.

Schmidt, E., & Sorota, A. (2025, November 16). This is no way to rule a country. The New York Times Sunday Opinion, p. 4.

Schofield, M. (2025, November 27). Ten ballot questions clear key hurdles. Greenfield Recorder, pp. A1, A10.

Tech and Social Cohesion. (2025, September 13). More people trust chatbots than elected leaders. https://techandsocialcohesion.substack.com/p/more-people-trust-chatbots-than-elected

Thompson, J. (2022, November 28). Algocracy would replace politicians with algorithms. Should we try it? Big Think. https://bigthink.com/thinking/algocracy-algorithm-government/

Resources

Apertus Isn’t (Yet) the Win You Think It Is. Maxime Grenu. LinkedIn (September 2, 2025).

  • Assesses Switzerland’s efforts to build an ethical large language model for the public good, trained on only publicly available content.

Author Bios

Torrey Trust, Ph.D. is a Professor of Learning Technology in the College of Education at the University of Massachusetts, Amherst. Her work centers on empowering educators and students to critically explore emerging technologies and make thoughtful, informed choices about their role in teaching and learning. Dr. Trust has received the University of Massachusetts Amherst Distinguished Teaching Award (2023), the College of Education Outstanding Teaching Award (2020), and the International Society for Technology in Education Making IT Happen Award (2018), which “honors outstanding educators and leaders who demonstrate extraordinary commitment, leadership, courage, and persistence in improving digital learning opportunities for students.” More recently, Dr. Trust has been a leading voice in exploring GenAI technologies in education and has been featured by several media outlets in articles and podcasts, including Educational Leadership, U.S. News & World Report, WIRED, Tech & Learning, The HILL, and EducationWeek. http://www.torreytrust.com

Robert W. Maloy is a senior lecturer in the College of Education at the University of Massachusetts Amherst where he coordinates the history teacher education program and co-directs the TEAMS Tutoring Project, a community engagement/service learning initiative through which university students provide academic tutoring to culturally and linguistically diverse students in public schools throughout the Connecticut River Valley region of western Massachusetts. His research focuses on technology and educational change, teacher education, democratic teaching, and student learning. He is co-author of AI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students; Transforming Learning with New Technologies (4th edition); Kids Have All the Write Stuff: Revised and Updated for a Digital Age; Wiki Works: Teaching Web Research and Digital Literacy in History and Humanities Classrooms; We, the Students and Teachers: Teaching Democratically in the History and Social Studies Classroom; Ways of Writing with Young Kids: Teaching Creativity and Conventions Unconventionally; Kids Have All the Write Stuff: Inspiring Your Child to Put Pencil to Paper; The Essential Career Guide to Becoming a Middle and High School Teacher; Schools for an Information Age; and Partnerships for Improving Schools.


Agentic AI: What Educators Need to Know

Conversations about artificial intelligence have been everywhere, especially over the past three years since the launch of ChatGPT. Many new technologies and advancements in education and work have emerged as a result of AI-powered tools. And now, something else is becoming part of the conversation: have you heard about “agentic AI”? When I have spoken about it, the response has often been that it sounds abstract or highly technical, and for some, even a little scary. It has become another buzzword to add to the AI-related vocabulary. But agentic AI represents a real shift in what AI can do and, for educators specifically, in how it can support teaching and learning in ways that go beyond chatbots and text, audio, and image generation.

Whether you teach kindergarten or high school, and whether you feel confident with AI or are just starting to explore it, agentic AI is something you’ll want to understand. Not only because it’s an evolving area, but because it is beginning to reshape how educators think about their workflow, student agency, and classroom productivity.

So what is it? Why does it matter? And how can we use it meaningfully in our practice?

What Is Agentic AI?

Agentic AI is different from the tools we have become used to and probably use frequently. Most AI tools, such as ChatGPT, Gemini, or Claude, fall into the category of generative AI. You provide a prompt, and these LLMs or other tools produce a response. They can draft, summarize, translate, and brainstorm, but they only work step by step based on your input.

How Agentic AI Is Different

Agentic AI refers to systems that can take on multi-step tasks, make autonomous decisions within given parameters, and carry out complex workflows with minimal human input. Rather than telling AI what to write, you tell an agent what you want to accomplish, and it decides on the steps needed to get there and then carries them out.

I think of it like moving from having a powerful assistant to a collaborator who takes the initiative and digs into the research and the work.
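To make that shift concrete, here is a minimal conceptual sketch in Python. It is not the API of any real product, and the function names are hypothetical placeholders standing in for model or tool calls; the point is simply the plan, act, and self-check loop that separates an agent from a single-prompt generative tool.

```python
# Conceptual sketch only: not any specific product's API. The helpers are
# hypothetical placeholders; in a real system they would be model or tool calls.

def break_goal_into_steps(goal: str) -> list[str]:
    # Placeholder "planner": a real agent would ask a model to decompose the goal.
    return [f"Draft an outline for: {goal}", f"Build materials for: {goal}"]

def perform_step(step: str) -> str:
    # Placeholder "actor": a real agent would call a model or an external tool here.
    return f"Completed -> {step}"

def meets_requirements(outcome: str) -> bool:
    # Placeholder "checker": a real agent would evaluate its own output against the goal.
    return outcome.startswith("Completed")

def run_agent(goal: str) -> list[str]:
    """Pursue a goal by planning steps, acting on each, and checking the result."""
    results = []
    for step in break_goal_into_steps(goal):
        outcome = perform_step(step)
        if not meets_requirements(outcome):
            # Retry once with a revised step instead of silently accepting a weak result.
            outcome = perform_step(f"Revise and retry: {step}")
        results.append(outcome)
    return results  # the educator still reviews everything before it reaches students

if __name__ == "__main__":
    for item in run_agent("a standards-aligned review activity for a Spanish 1 unit"):
        print(item)
```

However simple, the sketch highlights the key difference: the human sets the goal and reviews the results, while the agent manages the intermediate steps.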

Examples include AI that can:

  • Analyze student work, identify patterns, and suggest grouping strategies
  • Build a multi-week lesson that includes relevant standards, suggested pacing constraints, classroom goals, and more
  • Draft emails, create slides, and prepare communication resources like newsletters or infographics
  • Review data, generate insights, and highlight actionable next steps

What Agentic AI Means for Education

In my experience so far, the use of agentic AI has mostly been about testing its capabilities, saving time, and becoming more efficient. Those benefits matter for several reasons, but one stands out as critical: the time saved can then be used to work with our students and colleagues, and to connect as only humans can.

Here are three ways that agentic AI can assist educators in our work:

1. Automating the Work That Reduces Our Time with Students

Teachers spend enormous amounts of time on administrative tasks, and agentic AI can reduce this load. An agent can help with scheduling, generating lesson ideas, creating resources for class instruction, and more.

2. Supporting Differentiation and Personalization

Differentiation is important, and it can take time to find the right ideas for every student. Agentic AI can analyze learning objectives, reading levels, standards, and classroom needs and then generate supports such as modified reading passages, tiered problem sets, alternative explanations for complex ideas, sentence stems or vocabulary scaffolds, and suggested enrichment activities.

Rather than creating multiple versions of an assignment or assessment, teachers can leverage the agent to design or suggest differentiated materials and then use the time saved to support students more meaningfully.

3. Improving Digital Wellness Through Better Workflow

Digital wellness and balancing the use of tech are also common topics of discussion, especially with so much tech available. Agentic AI can support digital wellness when used purposefully. Instead of having students spend more time navigating apps, notifications, or endless digital distractions, an agent can streamline tasks and reduce digital overwhelm. Ask the agent to organize resources or create a structured plan based on a few ideas, then use the suggestions to build out a plan on your own.

What Agentic AI Is Not

Knowing what Agentic AI is and how it works is important. However, it is also important to understand what it is not.

Agentic AI is not:

  • A replacement for teachers
  • A grading automation system that removes human judgment
  • A tool that should work without guardrails
  • Something to hand to students without teaching digital citizenship and AI literacy

Instead, agentic AI should be a partner that is only used in combination with human oversight, reflection, and ethical boundaries.

This is where we, as educators, play an essential role.

How to Try Agentic AI Today

Start with Your Workflow

Try an agent-based tool to:

  • Organize weekly lessons
  • Generate draft template emails (never include any personally identifiable information, or PII)
  • Build slide decks or provide bullet points for slides
  • Review data (remove PII) and summarize trends

I always suggest starting small. Think about one challenge or “pain point,” and then explore how an agent can help.

Use Agents for Planning and Support

Ask an AI agent to:

  • Create a standards-aligned sequence for a unit
  • Design project-based learning ideas
  • Suggest or generate differentiated materials
  • Identify vocabulary that students may struggle with

Always review carefully. Revise and personalize the outputs through your own experiences and specific needs.

Agentic AI is another change that we need to adjust to, and maybe not fully embrace, but at least explore to understand what it is, how it works, and its potential benefits and concerns. As with all technology, we have to keep the focus on human-centered teaching, purposeful and intentional implementation, and clear boundaries.

If you have not yet tried agentic AI, take a few moments to see what it can do. I’d love to hear how it goes!


Vibe Coding and an Hour of AI Adventure for All Classrooms

A New Twist on the Hour of Code

Computer Science Education Week is recognized in December each year, with the timing selected to coincide with the birthday of Grace Hopper, a pioneer in computing. Every year during Computer Science Education Week, classrooms around the world plan activities for the Hour of Code to inspire everyone to explore the possibilities and opportunities available through coding. But this year, the plans may look a little different. There has been a shift toward the Hour of AI.

Over the past three years, AI has continued to advance and bring more tools into our classrooms and the world. There are so many possibilities when it comes to AI and coding, and the technology keeps improving. Now, through a collaboration between Imagi Labs and Lovable, educators and students can dive into coding without writing a single line of code. It sounds impossible, but it is true: educators and students create code simply by describing what they want. This is vibe coding. And the best part is that you don’t need a background in coding to get started! My recent experience with Lovable and Imagi has shown how easy it is to build an app, create a game, and more, simply by using natural language prompts. (Sign up to learn more during the Tuesday, December 9th webinar here.)

And when it comes to AI, there have been valid concerns around data privacy. With Imagi and Lovable, it is easy to get started without sharing student data or going through a time-consuming and complex setup. Vibe coding and the resources available help promote computer science and AI literacy in all classrooms and keep the focus on healthy and intentional use of AI.

So What is Vibe Coding?

Vibe coding is a way to dive into coding without writing lines of code. Rather than typing out code, you simply use words to describe the “vibe” of the program you want to create, and AI helps build it. Think about what happens with prompting: with vibe coding, you use natural language prompts to describe the kind of game or app you want, and the AI takes care of generating the code. In my more recent experiences, I have explored Imagi along with Lovable, an AI-powered platform that lets anyone (with or without coding experience) create websites, apps, and games simply by describing them. The focus shifts to the wording, and then the ideas turn into a working project. You spend time considering the concept, refining the descriptions, and iterating throughout the process.

I have used Imagi Labs for over a year, and now, with the new vibe coding learning experience, I have more ways to focus on computer science and AI literacy. Imagi has partnered with Lovable to make vibe coding more classroom-friendly and easier to start. Through Imagi, educators have access to ready-made curriculum and a special school-safe mode for Lovable that does not require personal student accounts. Now all students can safely join in an Hour of AI activity and experience AI-driven coding, which educators can facilitate with more comfort and confidence.

Why Hour of AI and Vibe Coding?

The Hour of AI is an evolution of the Hour of Code, which I have participated in with my students for years. Initially, I thought of it as just an hour, but it is really meant to be an hour that inspires you to continue bringing coding and computer science opportunities into all classrooms. There is a growing need to build foundational AI literacy skills in addition to computer science skills in order to prepare students for the future. Whether Hour of Code or Hour of AI, the goal of these resources is to show students that anyone can explore AI and coding.

Vibe coding is the perfect activity to explore because it makes it even easier. I think about it like this: if you and your students can write a sentence or explain a concept, then you can start creating with code. Vibe coding does not require prior coding experience. Through Imagi and Lovable, there are tutorials that show anyone can learn to code, and in a fun, AI-powered way. It is often described as plug-and-play, and I think it is another great opportunity for the Hour of Code/AI season this year! To learn how to use it, join us for a great conversation and demo!

A peek at Tuesday’s webinar.

Creative Coding

What I have always enjoyed during Hour of Code or Computer Science Education Week activities are the reactions of the students! Whether they build a game or just learn more about coding and become excited about the possibilities, it is always a great learning opportunity for them, and for me too.

The opportunity to build and customize their own video game draws them right in. The specific project they’ll create is totally up to them, which sparks creativity and builds confidence and excitement in learning. What makes it even better is how students build it: simply by typing their ideas in plain text, through a prompt, they end up with quickly generated code. For example, a student might start with a prompt like, “Create a game where a cat catches falling treats and earns points.” Lovable’s AI will take their prompt and generate an initial game, which may have a cat sprite at the bottom of the screen that you can move and treats dropping from the top. Students then test the game to see how it works and collaborate to improve it.

From there, the creative iteration kicks in. Maybe one student wants the game to be about space instead of treats. They just need to ask the AI to switch the theme by typing in, “Change it to a space game catching asteroids instead of treats.” Starting with catch-the-item games is a great way to begin, and because students’ games can be adapted to any subject or story, the activity helps engage their personal interests and connect meaningfully with classroom content. The AI takes care of the coding, but students remain the designers, guiding the outcome with their descriptions. And this is how we move them from consumers to creators and innovators!

This process also introduces the concepts of prompt refining and debugging in a very digestible way, especially if there are limits on the number of prompts students can use; it requires them to really think through their ideas and be specific. If the game doesn’t run exactly right on the first try, students learn to tweak their description by adding more details. They may ask to move an item faster or change a color to a lighter shade. Students work on debugging by having a conversation with the AI, which helps them problem solve too. They learn how to write prompts and debug creatively while building their game, which results in less frustration and instead sparks curiosity. Students can consider questions like: What happens if I ask the AI to do this? How can I change the appearance of the characters or the background?

Students can publish or share their game, which they always enjoy! For some students, this may be the first time they’ve coded something playable, which is a huge confidence boost and, hopefully, the moment they realize that coding (and AI) can be creative, fun, and, most importantly, something everyone can do. Another benefit is the collaboration that happens. Want to join us and learn together? Sign up here for our livestream happening Tuesday!

Building AI Literacy and CS Skills

Beyond the excitement of making a game, vibe coding activities provide real instructional value. They align with traditional computer science foundations and emerging AI literacy standards. The available lessons have been mapped to AI literacy competencies from the AILit framework, including the skills to Evaluate, Create, and Design with AI.

  • Evaluate: Students practice critical thinking by examining what the AI produces and deciding if it’s acceptable or needs some tweaking. For example, if the AI’s first attempt has a bug or the theme is slightly off, students must decide whether to accept the result, refine their prompt, or start again. Students learn to question the AI output rather than trust it immediately, which is a key AI literacy skill they need to develop.
  • Create: Rather than simply playing and consuming a game, students collaborate with generative AI to create one. They continue to refine the results and reflect on how their prompts (their thought processes) lead to different outcomes. It’s an easy way to introduce how human creativity and AI can work together, rather than having AI replace their thinking. Students see that AI can assist their creativity, but that their own ideas and adjustments are what actually drive the project.
  • Design: By the end, students are able to describe how an AI system like Lovable helped them build a solution to a problem or project idea. They realize that they have designed a simple software product by leveraging AI, and they see how AI tools might help solve problems in any field. I think this is a great way to engage students in discussion in any subject or to focus on community issues: designing with AI for real-world contexts.

Using these tools, students learn classic computer science concepts in an age-appropriate way. They work with algorithmic logic (the game has rules like “if the cat catches treats, the score increases”), and they practice testing and debugging (when their game doesn’t work as expected, they try again and iterate). The difference is that the AI handles the syntax and the heavy coding, which allows students to focus on the logic and the game design. It is truly empowering for younger learners and for any learner who may hesitate to try traditional coding. They learn to code in a way that removes the roadblocks that can come from cryptic coding errors.
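As a rough illustration of that algorithmic logic, here is the kind of rule a vibe-coded catch game rests on, written in plain Python for readability rather than the web code a tool like Lovable would actually generate. The names and numbers are made up for the example; the takeaway is that the plain-language rule a student types becomes a conditional that the AI writes for them.

```python
# Illustration only: a stripped-down version of the scoring rule behind a
# "catch the falling treats" game. The real generated code would differ;
# names and numbers here are invented for the example.
import random

def play_round(cat_column: int, num_treats: int = 10, columns: int = 5) -> int:
    """Drop treats into random columns; the score increases each time one lands on the cat."""
    score = 0
    for _ in range(num_treats):
        treat_column = random.randint(0, columns - 1)  # where this treat falls
        if treat_column == cat_column:   # the student's rule: "if the cat catches a treat..."
            score += 1                   # "...the score increases"
    return score

print(play_round(cat_column=2))
```

Seeing the rule written out this way can also help students connect their prompt wording to the conditional logic the AI produces.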

Teacher Support

Trying a new tech tool in class can be time-consuming, but Imagi + Lovable make it easy to dive in. There are a variety of teacher supports available to help teachers feel prepared and confident, even if it’s the first time exploring AI and coding in the classroom. A few of the features:

  • Detailed Lesson Plan: A step-by-step lesson guide is provided, outlining the learning objectives, timing for each part of the activity, discussion questions, and potential student responses. It’s basically a script you can follow or adapt.
  • Slide Deck: There are ready-to-use slides designed for projecting in class while you run the Hour of AI. They introduce key concepts (like “What is AI?” and “What is vibe coding?”), show visual examples, and include prompt examples to guide students. There are also speaker notes.
  • Account Setup Is Simple: Imagi handles creating student accounts for Lovable with one click. The focus is on privacy-first (accounts are anonymous and expire after the event).
  • Troubleshooting Help: Technology is great until it isn’t. But for this, don’t worry because the Hour of AI pack includes a troubleshooting guide for common issues.

There are more supports available! Sign up here for our livestream happening Tuesday!

By participating in this event and exploring Vibe coding during the Hour of Code/AI, we are helping students build foundational AI literacy in an engaging way.

If you’ve been thinking about coding and AI, then Computer Science Education Week and the Hour of AI are the perfect time to dive in. Set aside an hour for vibe coding and see the impact when students see their ideas come to life.

Ready to get started? Join the webinar or sign up to get the recording and resources!

Let’s work on fostering creativity and building AI literacy for every student…one vibe at a time!
