Not long ago, artificial intelligence in education felt novel: shiny, experimental, and, for many educators, unsettling. When ChatGPT arrived in November 2022, the initial conversations centered on fear. I recall receiving emails, text messages, phone calls, and visits from educators concerned about cheating, plagiarism, lost skills, and what instantly felt like an overwhelming pace of change. It was one more thing to adjust to, not long after the upheaval so many of us felt in March of 2020.
But since that initial adjustment at the end of 2022 and through 2023, I’ve seen a shift happening. At first, there was skepticism, uncertainty, and hesitation, and not just in the world of education. As we’ve continued to adjust to new tools and new ways of working, the conversation has moved from treating AI as a “what if” to accepting that AI is here and its use is increasing. It’s embedded in tools educators already use, and if it hasn’t already, it will slowly but surely become part of the daily routine and workflow of teaching and learning.
I’ve spoken about this shift from novelty to normalcy and how it brings a new challenge: educator upskilling.
A few years ago, I started researching the AI training available to educators and other professionals. At the end of 2023, 87% of educators in the United States had not received any training. In my workshops, some attendees are having their first training experience more than three years after ChatGPT made its debut. So we need to focus on an important question, whether in education or not. The question is no longer whether educators need professional learning around AI; most people agree that they do. The bigger issue is whether we are approaching AI professional development in ways that are deep, sustained, and human-centered, or whether we’re still offering one-and-done sessions that barely scratch the surface. With AI and the pace of change in education and the world, we need to do better and be prepared.
Shifting to Ongoing Capacity Building
When I completed my doctorate nearly two years ago, my research focused heavily on professional learning in emerging technologies, with a strong emphasis on AI. Even then, the message was clear. A single PD session, or even a series of short, tool-based trainings, was not enough, especially if completed early in the year or during a limited time span.
Yet, from what I am learning, that is how AI PD is often structured today. Through surveys in my sessions and conversations with other educators, a common experience keeps surfacing:
A 30-minute overview.
A 15-minute “certified educator” badge.
A well-done walkthrough of a single tool.
While these experiences can be helpful, especially for getting started and when time is limited, in the long term, they don’t build AI literacy. They build familiarity, whether with AI concepts or an AI tool. But familiarity is not AI literacy. Not for us as educators, nor for the students we are preparing for a future surrounded by AI and a world of work that seeks employees skilled in AI.
Continue reading the original post on Getting Smart.
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. By EdTech Digest, she was named the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including “What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More” and “How To Teach AI.” Other titles include “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World,” “True Story: Lessons That One Kid Taught Us,” and “Things I Wish […] Knew.” Her newest, “How To Teach AI,” is available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
Interested in writing a guest blog for my site? I would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
Also, check out my THRIVEinEDU Podcast here!
Join my show on THRIVEinEDU on Facebook. Join the group here.
During our ThriveinEDU livestream conversation about Kira, we explored a question that immediately resonated with educators:
What if planning, grading, and differentiation actually took half the time and still kept teachers in control of learning?
The question isn’t just about efficiency. It’s about sustainability and about supporting teachers to make instruction more responsive, more personalized, and more aligned to what students actually need in the moment: real-time responses, authentic feedback, and support from their teachers.
Kira recently released several new features (as part of the Kira 2.0 launch) that move beyond treating AI as a “lesson generator” or “assessment creator”; it now works as a thought partner in the instructional workflow. After attending the Live Launch in New York on March 3rd and moderating the livestream, I want to share some of the biggest takeaways from the conversations that make the newest updates especially impactful for classrooms now.
Lesson/Course Studio
Many AI tools help teachers create one lesson at a time, which is highly beneficial and time-saving. But imagine you’re tasked with creating a course you’ve never taught or don’t have enough resources for. The amount of time needed is a bit overwhelming.
Kira’s Course and Lesson Studio helps educators generate both structured lessons and full, standards-aligned courses, including course outlines, unit sequences, lesson progressions, and assessments.
Educators provide the topic, subject, grade level, and standards, and from that prompt, Kira builds the lesson with embedded formative checks already in place.
Formative assessment often happens after instruction; with Kira, teachers see student understanding during instruction.
As Rachel shared during the livestream:
“I don’t remember a time when I wasn’t taking work home or trying to get ahead of the game by planning out my week and then having to rewrite it midweek. It was so much work.”
Kira’s curriculum-building features help reduce that cycle in far less time. Rather than rewriting lessons to meet student needs, teachers start with a flexible structure they can adapt immediately, and, most importantly, stay in control. We are doing the editing, adjusting, and shaping of the lesson. This is an important distinction to make because it shows how crucial it is that teachers remain involved and review what has been generated.
Real-Time Insight Instead of End-of-Unit Surprises: Student Atlas
I have known about this for a few months and thought it was amazing. One of the most exciting updates in Kira 2.0 is Student Atlas, the platform’s student insight dashboard, now paired with Class Atlas, which brings those insights together at the class level.
Student Atlas provides:
concept-level mastery tracking
data confidence indicators
individual student support indicators
zones of proximal development insights
intervention suggestions
Rather than relying on a single quiz or test score, teachers can see which concepts students understand and where they’re struggling in real time. It enables us to see what concepts need reinforcing now, rather than waiting until the assessment is over and graded.
Class Atlas builds on this by turning individual insights into a clear, actionable class-wide view. Instead of opening 20+ student profiles and piecing things together, teachers can instantly answer: Where should I focus my instruction? and Which students need help with this skill? Teachers can even ask Kira to explain how it generated its recommendations, which helps schools as they look for tools and want to trust AI technologies.
Student Atlas also includes a data confidence indicator, helping educators assess the reliability of recommendations before making instructional decisions. That transparency supports professional judgment instead of replacing it.
Standards Alignment
Standards alignment is often one of the most time-consuming parts of planning, especially when building units or courses, and especially for educators teaching multiple courses. With Kira 2.0, that burden decreases: Kira automatically tags lessons, activities, assessments, and questions to state standards, underlying skill progressions, and Bloom’s taxonomy levels.
Teachers can track how students are progressing through skills over time.
Supporting Multilingual Learners
Another standout feature we spoke about in the livestream is Kira’s built-in support for multilingual learners.
When gaps in understanding appear, Kira can generate:
scaffolded practice
targeted follow-up lessons
leveled reading supports
vocabulary scaffolds
translated instructional materials
Each of these supports is based on individual student performance, and not on a generic template that does not align with the student’s needs.
Differentiation becomes responsive rather than reactive.
During the livestream, we talked about how, historically, differentiation required teachers to manually create multiple versions of lessons or assessments, which, of course, took a lot of time. With Kira, these supports are embedded directly inside the instructional workflow. Rachel said, “Especially talking about differentiation and the ease of it and being able to have the assistant nearby and go back and forth.”
Embedded support assists educators in providing what each student needs while giving them more time to work directly with each student.
Kira provides structure, but the teachers are the designers who provide the course’s vision.
Kira brings planning, assessment, differentiation, and student insight into one connected space. And when those pieces connect, teachers gain something incredibly valuable:
clarity
flexibility
time
better visibility into learning
What are GenAI technologies, and what do we want them to become? Right now, GenAI is an educational chameleon, aggressively marketed as an indispensable learning companion, an academic partner, and a labor-saving tool; and at the same time, widely critiqued as a dangerous source of misinformation and biased responses, an environmental degrader, and a privacy invader. Since GenAI is all of these things and more, how do we use these tools appropriately and thoughtfully?
What GenAI is and what it will become depends on YOU – how you think about its roles, use it in your teaching and learning, and describe its functions to others.
Let’s look at two currently popular descriptions and uses of GenAI: 1) GenAI as a companion; 2) GenAI as a productivity-enhancing tool.
First, GenAI is widely described and used as a supportive “companion” or helpful “partner.” The Harvard Business Review (2025) reported that therapy/companionship was the number one way people were using GenAI in 2025. An alarming number of teens acknowledge that GenAI chatbots are their virtual companions, even though this technology can exploit youngsters’ emotional needs in ways that lead to self-harm and other risks (Common Sense Media report, Robb & Mann, 2025). One of the key problems here is that GenAI is NOT human, and it is not even intelligent (at least in the way humans perceive and describe intelligence).
The Key Takeaway: Using terms like “partner” or “companion” to describe GenAI technologies humanizes tools that are not designed to provide the support, guidance, and level of intelligence that actual humans can provide.
Second, GenAI technologies are widely presented as productivity-enhancing, time-saving, efficiency-increasing tools for people to use to improve their lives. “Use ChatGPT to make life easier,” declared a recent email advertisement, where all one had to do was “just tap a chat to start.” Personal and professional productivity is also one of the top ways people are using GenAI technologies – from writing emails and reports, to planning vacations and meals, to studying for exams; and it is certainly true that GenAI technologies can do all these things and so much more really fast. Yet personal autonomy, creativity, and agency are lost when one uses GenAI technologies to automate activities they formerly did without them.
The key takeaway: Avoid talking about GenAI as automating work and think directly about how it can augment or supplement your activities as a teacher and a learner.
So if not a human-like companion or a productivity-enhancing automation tool, then how can we think about the role of GenAI in education? We believe that GenAI is best used when it augments teaching and learning, much the way a caddie enhances the golf experience. As such, we offer a metaphor of GenAI as a caddie, but again remind you that it is not an actual caddie and we are not trying to humanize this tool.
Professional golfers and their caddies on the LPGA, PGA, and more than 20 professional golf tours worldwide offer a metaphor for thinking about, describing, and using GenAI. Each pro golfer has a caddie who carries their clubs and walks alongside them during competitive tournaments, sharing ideas and information about the shots they are playing. For instance, until recently, LPGA player Brooke Henderson’s caddie was her older sister, Brittany; PGA player Xander Schauffele’s caddie is Austin Kaiser (his college golf teammate at San Diego State University).
Caddies have detailed information about the course and provide suggestions and feedback about what shots to hit with which clubs. They help keep track of the pace of play and how conditions of the course may be changing due to wind, weather, and time of day. However, it is the golfer who remains totally in charge of the outcomes of the game. Caddies do not hit the golf ball; golfers do not always do what the caddy suggests. It is the golfer who must make decisions, hit the shots, and deal with consequences, both positive and negative, in terms of performance and score. Caddies are there to augment the golf experience and outcome.
When it comes to teaching and learning, GenAI can be that source of information, ideas, or inspiration like a caddie; and it is the teacher who must determine what to do with that information. They have the expertise; they understand their classroom dynamics and contexts; they know their students, their topic, their grade level, and their community.
The key is for the teacher to resist the temptation to automate their work by turning it entirely over to a GenAI technology, because in this case GenAI is in control of the shots, rather than the teacher. It is as if professional golfers let their caddies choose the club and then hit the ball for them. This is even more problematic when it comes to using GenAI to automate tasks. In our metaphor, the caddie is a human who has expertise and has played golf before; however, GenAI is not a teacher, has never taught, and has no idea what teaching is. Turning over any tasks to a tool that does not have any expertise in education can become really problematic. Teachers must maintain agency and exert control, deciding when to accept, when to reject, and when to modify whatever ideas and information the GenAI provides.
So, returning to our original statement, what GenAI is and what it will become depends on YOU – how you think about its roles, use it in your teaching and learning, and describe its functions to others. What do YOU want GenAI to be?
Nearly 50 years ago, at the outset of the computer revolution in schools, Seymour Papert asked: Will computers program the child, or will educators create the conditions where children program computers? For Papert then, as for us today in the age of GenAI, using technology remains a question of human control and user agency. GenAI can provide amazing resources, but it is essential that you retain your decision-making and personal creativity. Only then will the results be truly yours.
Torrey Trust, Ph.D., is a Professor of Learning Technology in the College of Education at the University of Massachusetts Amherst. Her work centers on empowering educators and students to critically explore emerging technologies and make thoughtful, informed choices about their role in teaching and learning. Dr. Trust has received the University of Massachusetts Amherst Distinguished Teaching Award (2023), the College of Education Outstanding Teaching Award (2020), and the International Society for Technology in Education Making IT Happen Award (2018), which “honors outstanding educators and leaders who demonstrate extraordinary commitment, leadership, courage, and persistence in improving digital learning opportunities for students.” More recently, Dr. Trust has been a leading voice in exploring GenAI technologies in education and has been featured by several media outlets in articles and podcasts, including Educational Leadership, U.S. News & World Report, WIRED, Tech & Learning, The HILL, and EducationWeek. www.torreytrust.com
Robert W. Maloy is a senior lecturer in the College of Education at the University of Massachusetts Amherst, where he coordinates the history teacher education program and co-directs the TEAMS Tutoring Project, a community engagement/service learning initiative through which university students provide academic tutoring to culturally and linguistically diverse students in public schools throughout the Connecticut River Valley region of western Massachusetts. His research focuses on technology and educational change, teacher education, democratic teaching, and student learning. He is co-author of AI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students, Transforming Learning with New Technologies (4th edition); Kids Have All the Write Stuff: Revised and Updated for a Digital Age; Wiki Works: Teaching Web Research and Digital Literacy in History and Humanities Classrooms; We, the Students and Teachers: Teaching Democratically in the History and Social Studies Classroom; Ways of Writing with Young Kids: Teaching Creativity and Conventions Unconventionally; Kids Have All the Write Stuff: Inspiring Your Child to Put Pencil to Paper; The Essential Career Guide to Becoming a Middle and High School Teacher; Schools for an Information Age; and Partnerships for Improving Schools.
In Part 1, I shared why understanding the legal landscape of artificial intelligence is essential as schools continue to explore how these tools can support teaching and learning. Schools everywhere are thinking through policies and how to best provide resources for educators, students, and families. Awareness of laws such as FERPA, COPPA, and GDPR, accessibility requirements, and concerns such as algorithmic bias and deepfakes set an important foundation for responsible implementation.
We need guidelines and guardrails. A common question I hear from educators and leaders after presenting sessions and workshops, or speaking at conferences, is: “What do we do next?”
Understanding the guardrails is only the first step. The real work begins when schools start building systems that support educators in applying this knowledge in practical, sustainable ways. And it requires true collaboration.
Responsible AI Adoption Is a Team Effort
One of the most important shifts happening right now is the recognition that AI adoption and policy development should not be the responsibility of a single person or a select few administrators or IT teams. Responsible implementation and policy development require collaboration across roles.
District leaders are shaping policy and expectations for the school community.
Technology teams are evaluating vendor compliance and infrastructure readiness. (I have a future post coming up about IT Teams and ongoing PD).
Instructional leaders are aligning tools with learning goals and supporting teachers with implementation.
Teachers are modeling and supporting ethical classroom use.
Students are exploring and developing AI literacy skills that will shape how they interact with technology throughout their lives.
What I truly believe is that when schools recognize AI is a shared responsibility rather than an isolated initiative, implementation becomes more intentional, reflective, and sustainable.
I consistently see this when working with districts across the country. The schools that are moving forward with confidence are not the ones adopting the most tools. They are the ones creating a community, developing a common language, and building shared understanding first.
Transparency Builds Confidence Across the Community
Another theme that has been coming up in conversations with educators and families is trust.
Families want and need to know:
What tools are being used?
What information is being collected?
How is student data protected?
How is AI, or any technology, being used in support of learning rather than replacing it?
Having clear answers to these questions helps to strengthen the essential partnerships between schools and families. It also creates opportunities for students to participate more actively in conversations about responsible technology use.
Transparency is not simply a compliance strategy. It is a relationship-building strategy. When schools communicate clearly and proactively, they reduce uncertainty and help communities better understand how innovation supports student success.
AI Literacy Is Now Part of Digital Citizenship
One of the biggest shifts happening in education right now is the expansion of digital citizenship to include AI literacy. We’ve been talking about media literacy, digital literacy, AI literacy, and even discernment. Our work is a bit more involved now, and we need to be prepared.
Students are already interacting with AI systems daily, both in and maybe more frequently outside of school. They need guidance, which means classrooms must play an essential role in helping students understand:
How to protect their personally identifiable information (PII)
How AI systems generate responses
How bias can appear in outputs
How misinformation spreads
How data is collected and used
How to evaluate whether a tool should be trusted
AI literacy is not about teaching students how to use a single platform. It is about helping them develop judgment.
When students learn how to ask better questions about technology, they become more confident learners and more thoughtful digital citizens. Emerging tools continue to shape how students research, communicate, and create, and as educators, we have to keep learning so we can guide them to use the tools available to them safely and successfully.
Accessibility and Equity at the Center
As schools explore AI tools, accessibility must be a part of every conversation.
AI has tremendous potential to support multilingual learners, provide personalized feedback, assist with reading and writing tasks, and help students access content in new ways. It has endless ways to support educators. Schools must continue evaluating whether tools meet accessibility expectations and support equitable learning experiences.
Responsible implementation means asking questions such as:
Does this tool improve students’ access?
Does it create barriers, particularly given the renewed attention to the digital divide?
Does it support multiple learning pathways?
Does it align with universal design principles? Or a Portrait of a Graduate or an AI-Ready graduate?
Technology should expand opportunity rather than narrow it.
Supporting Educators Through the Transition
One of the most encouraging things I have seen in my work with educators is their investment in learning and the desire to learn with and from their students.
Educators are exploring AI tools while also asking important questions about privacy, ethics, and instructional impact. This balance is exactly what responsible adoption should look like.
Professional learning plays an essential role.
Educators benefit from opportunities to:
Explore tools safely
Review privacy expectations
Understand policy implications
Design classroom strategies
Collaborate with colleagues
Develop shared language around responsible use
When professional learning includes both legal awareness and classroom application, educators feel more confident making decisions that support students. Confidence leads to stronger implementation. And this is the work I am most passionate about when working with schools.
Leadership Matters More Than Ever
School leaders are in a unique position to support responsible AI adoption by:
Developing clear expectations
Supporting cross-team collaboration
Communicating with families (consistently)
Reviewing vendor agreements carefully
Building a common language around the use of AI
Creating space for experimentation, but having guardrails in place
Moving Forward
Artificial intelligence is already part of the learning landscape. We should not be talking about whether schools should engage with AI, but rather deciding how they will engage with it.
When schools combine legal awareness, transparency, accessibility considerations, and strong professional learning structures, they create innovative environments built on human decision-making.
Students benefit when educators feel confident.
Educators benefit when leaders provide clarity.
Communities benefit when schools communicate openly.
Responsible AI adoption is about moving forward with purpose.
When schools take that approach and have a team to work with, they are preparing students to understand technology, question it, and be the ones who determine what comes next.
Artificial intelligence (AI) is rapidly transforming education. From lesson planning support to personalized learning pathways and administrative efficiencies, AI tools are an increasingly common part of everyday classroom practice. At the same time, the speed at which this technology has advanced and been adopted into classrooms has led to understandable uncertainty among educators, leaders, and families, who are asking important questions: What data is being collected? Who owns AI-generated work? What responsibilities do schools have when students and educators use these tools?
As both an attorney and educator who has spent more than eight years researching, teaching, presenting, and writing about AI, I have worked with schools across K–12 and higher education that are navigating these exact questions. I do not see the legal implications of AI as barriers to innovation; instead, they serve as guardrails that help schools adopt technology responsibly. The key is protecting students, educators, and institutions while staying informed. Understanding the legal landscape and the potential legal implications of AI use in classrooms helps schools move forward with confidence rather than hesitation.
Why AI and the Law Matter in Education
AI relies on data in order to function effectively. In schools, this means access to student information, classroom artifacts, writing samples, images, and even physical or behavioral data. Intent is not the deciding factor: even when educators believe they are sharing only minimal information that does not clearly identify a student, family member, or colleague, seemingly harmless details can qualify as personally identifiable information (PII).
I often share examples like referencing a favorite restaurant, a local landmark, a pet’s name, or an extracurricular activity, any of which could make a student identifiable when combined with other data points. Last year, an educator in one of my sessions described these scattered details as “enough stars to still form a constellation,” and that phrase has stuck with me; I have shared it in every AI and the Law session since. That is why evaluating tools carefully, and teaching students to do the same, is essential. Educators should not feel as though they are on a scavenger hunt when trying to find out what happens to their information. We need transparency from vendors so that educators are aware and informed.
AI is also changing how decisions are made in schools. With many advances, there are recommendation systems, automated feedback tools, and predictive analytics that can influence learning pathways, grading practices, and student support services. Having an understanding of how these systems work and how they should be used responsibly is becoming part of educators’ and school leaders’ professional responsibilities.
Key Laws That Shape AI Use in Schools
There are several important laws that guide how schools must approach AI.
FERPA (Family Educational Rights and Privacy Act) protects the privacy of student education records. When schools use AI-powered platforms that process student work or store learning data, they must ensure that these tools comply with FERPA requirements and clearly define how student information is handled.
COPPA (Children’s Online Privacy Protection Act) applies to students under the age of 13 and requires parental consent before collecting personal information through online services. Because many AI tools rely on user-generated input, COPPA compliance becomes especially important in elementary and middle school settings.
GDPR (General Data Protection Regulation), although a European Union law, is relevant to U.S. schools that use tools developed by companies operating internationally. Many platforms are created outside of the United States without educators being aware of it, so understanding GDPR is essential. Many platforms now include cookie permissions and data-use customization features in response to GDPR requirements, and these protections often benefit schools globally.
Schools should also consider state-level student data privacy laws, which are increasingly changing the expectations for vendor contracts, third-party integrations, and data retention timelines. District leaders and IT teams play an essential role in ensuring these requirements are addressed before tools are introduced into classrooms.
Data Privacy and Vendor Responsibility
AI tools require large amounts of data to function effectively. That data may be used to improve the tool itself, train additional models, or support integrations across connected platforms. Even when a tool states that it does not share user data, connected services or embedded features may still interact with stored information. I was asked two years ago, when speaking at LACOE in California during my AI and the Law session, if someone should “trust the platform when it says they do not share or store the data.” My instant answer was “No.” And it was for this exact reason.
Before introducing any AI platform in schools, educators and school leaders should review terms of service, privacy policies, and compliance documentation. Look for references to FERPA, COPPA, and additional privacy protections. Look for the date that the privacy policy was most recently updated. Districts should also confirm whether vendors use student information to train future AI models and whether contracts clearly define ownership and storage expectations.
This is where collaboration with district technology teams becomes essential. Responsible adoption is not an individual teacher’s decision. It is a system-level responsibility supported by leadership, policy teams, and instructional staff working together. Collaboration is key.
Transparency Builds Trust With Students and Families
Responsible AI adoption depends on communication. Families deserve clear explanations of the tools being used, the data being collected, and how that data is protected.
When working with students under age 13, written parental consent may be required. Even when it is not legally necessary, providing families with opportunities to ask questions strengthens trust and partnership. Transparency also empowers students. When students understand how AI systems work and the risks they may pose, they become more thoughtful digital citizens and more informed users of technology.
Schools that proactively communicate expectations for AI use are more likely to build families’ confidence and reduce misunderstandings about how these tools support learning.
Accessibility, Equity, and Emerging Legal Considerations
As schools adopt AI tools, accessibility and equity must remain part of the conversation. Laws such as Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA) require that digital learning tools be accessible to all students. If AI-powered platforms create barriers rather than support access, schools may face compliance concerns. We need to consistently audit the tools we are using. It must be an ongoing process.
Schools must also consider how AI intersects with Title IX responsibilities, especially with the rise of deepfake technology, which introduces new risks related to harassment and student safety. Policies must be in place to address the misuse of generative AI tools and to clearly define expectations and response procedures.
Algorithmic bias and fairness are important parts of the conversation. Schools should evaluate whether AI systems produce equitable outcomes across student groups and whether automated recommendations influence learning opportunities in unintended ways. Responsible implementation includes ongoing evaluation, not just initial approval.
Teaching Digital Citizenship With AI Literacy
Legal compliance alone is not enough. Students must also develop the skills needed to evaluate AI responsibly.
Developing skills in these areas means recognizing risks such as deepfakes and misinformation, bias in generated content, and cyberbullying that is supported by emerging technologies. Schools that integrate digital citizenship with AI literacy will guide students to become thoughtful participants in technology-rich environments rather than passive users who lack true understanding and AI literacy skills.
Clear expectations around appropriate use and academic integrity help students develop ethical decision-making skills that extend beyond the classroom.
Supporting Schools and Organizations Through AI and Legal Guidance
As AI adoption accelerates, schools will benefit from having a structured support system in place that connects legal awareness with thoughtful and purposeful classroom practice. Through my work with educators in K–12 and higher education, I provide professional learning experiences that help schools understand privacy requirements, implement responsible AI strategies, and align classroom applications with policy expectations.
My work includes keynote presentations, workshops, district leadership sessions, curriculum planning support, and customized training focused on data privacy, academic integrity, digital citizenship, accessibility considerations, vendor evaluation, and responsible AI adoption. Each training is tailored to address specific needs, ranging from introductory awareness sessions to deeper implementation planning and leadership strategy development.
In addition to supporting schools and universities, I work with organizations across other sectors to explore how to implement AI responsibly while remaining aligned with legal expectations and organizational values. Many industries face the same challenges that educators do, surrounding uncertainty about data privacy, questions about intellectual property ownership, concerns about transparency in decision-making systems, and the need to develop policies that support ethical innovation. My work helps organizations evaluate tools thoughtfully, identify potential risks early, and create practical guardrails that support responsible adoption rather than reactive compliance.
Organizations in healthcare, legal services, workforce development, nonprofit leadership, and corporate training environments are increasingly recognizing the importance of AI literacy for employees at every level. Through workshops, leadership sessions, and strategy conversations, I help teams understand how AI systems work, the legal considerations that may be applicable to them, and how to build cultures of responsible use that prioritize trust, security, and human judgment.
Moving Forward With Confidence
Artificial intelligence is already shaping how students learn, communicate, and prepare for future careers. The goal is not simply to adopt AI tools, but to adopt them responsibly. And this is where our work as educators comes in and why we need to dive in and learn with and guide our students.
When educators understand the legal landscape of privacy, accessibility, intellectual property, and ethical use, they can make informed decisions that support innovation and student protection. With thoughtful planning, collaboration, and transparency, schools will create learning environments where AI enhances opportunities while maintaining trust, safety, and integrity across the entire school community.
I work with schools and organizations, both in person and virtually, to support thoughtful and responsible AI implementation through professional learning, curriculum design, and resource development specific to educators, students, and families, using a common language. I have also collaborated with leadership teams to develop AI guidance frameworks, classroom-ready activities, and policies that reflect legal considerations.
The resources created help districts communicate clearly and consistently with families about AI use, support educators in building AI literacy, and provide students with age-appropriate strategies for using AI safely, ethically, and responsibly. By combining legal insight with classroom experience, I help schools move beyond uncertainty toward sustainable systems that include clear expectations, transparency, and actionable guardrails for responsible use.
People approach learning in many different ways. Some individuals prefer reading detailed information, while others respond well to visual demonstrations or interactive activities. Discussions about learning often refer to the idea of “learning styles.” As we explained in our Learning Styles article, this is a concept that perpetuates the idea that certain individuals learn better when information is presented in their preferred style of learning. This concept highlights the importance of exploring multiple ways to engage with information.
Interactive environments can provide powerful opportunities for learning and skill development. Gaming has become one of the most widely studied forms of interactive engagement. Researchers increasingly examine how digital games stimulate cognitive processes that support attention, memory, and decision making. For lifelong learners who enjoy exploring new challenges, gaming provides an environment that encourages constant adaptation and information processing.
What the Latest Research Says About Gaming and Cognitive Performance
Recent academic research continues to explore the connection between gaming and cognitive performance. A study published in Frontiers in Psychology in 2025 examined how gameplay activities support cognitive development across different age groups and educational backgrounds. Researchers found that many games involve rapid information processing, pattern recognition, and strategic thinking. These activities activate multiple cognitive systems that support learning and decision making.
Additional reporting has highlighted how video games engage players in tasks that require coordination, attention, and problem-solving. The Guardian used data to show that gaming environments often encourage players to react quickly to changing conditions while evaluating information on the screen. These interactive challenges stimulate the brain’s capacity for concentration and flexible thinking.
Different gaming genres contribute to cognitive engagement in unique ways. Each format presents players with challenges that involve analysis, planning, and observation. Exploring a variety of genres allows players to experience multiple types of cognitive stimulation.
Puzzle and Logic Games
Puzzle games focus on pattern recognition, spatial reasoning, and problem-solving. Players interact with shapes, sequences, or logical relationships that require careful observation. Popular examples include Tetris, where players rotate and position falling blocks to complete lines, and Portal, a physics-based puzzle game that challenges players to solve spatial problems using teleportation portals.
Many puzzles require players to identify patterns or relationships between different elements on the screen. Mobile titles such as Candy Crush Saga ask players to match symbols in strategic ways to clear the board, while games like The Witness present environmental puzzles that reward careful exploration and logical thinking. Solving these challenges often involves experimentation and deduction, with each step building toward a clear solution.
Puzzle-based gameplay encourages players to process information methodically while exploring different approaches to solving a challenge. These experiences stimulate attention and reasoning skills through repeated observation and problem-solving.
Action and Adventure Games
Action and adventure games emphasize quick reactions and spatial awareness. Players often navigate dynamic environments where movement, timing, and observation determine success. Well-known examples include The Legend of Zelda: Breath of the Wild, where players explore large landscapes while solving environmental challenges, and Uncharted, which combines exploration, platforming, and action sequences.
The Frontiers in Psychology research highlighted how action-oriented games require players to track multiple visual cues while responding rapidly to new information. Titles such as Call of Duty encourage players to monitor fast-moving environments while making split-second decisions during gameplay.
Players often move through complex environments while evaluating risks and opportunities. Games such as Assassin’s Creed place players in detailed historical settings where navigation, timing, and observation guide progress through missions. This constant interaction with visual information supports attention and coordination during gameplay.
Online Casino Games
Online casino games also offer cognitive engagement through decision-making and observation. Players often analyze patterns, evaluate probabilities, and choose strategies that influence how they interact with each game.
Platforms such as Jili illustrate how modern casino eGames incorporate a variety of gameplay styles. Jili slot titles offer a range of volatility levels that appeal to different preferences. Some players enjoy steady gameplay rhythms, while others prefer higher intensity experiences that involve larger swings in outcomes. The platform also offers diverse themes that allow players to switch between different styles of gameplay. Titles range from classic casino-inspired designs to skill-based arcade-style challenges. This variety encourages players to explore different game environments while adapting their approach to each format.
Switching between themes and mechanics introduces new visual patterns and gameplay systems. Players engage with each game by observing results, responding to gameplay cues, and managing their choices during each round.
Strategy and Simulation Games
Strategy and simulation games involve long-term planning and resource management. Players evaluate information, compare potential actions, and make decisions that influence future outcomes. These gameplay systems encourage careful analysis and strategic thinking. Popular examples include Civilization, where players guide a society through centuries of development, and XCOM, which challenges players to manage resources and coordinate tactical missions against alien threats.
Players frequently track multiple variables during gameplay, such as resources, time limits, or environmental changes. Managing these variables requires sustained attention and logical reasoning. In games like Civilization, players balance diplomacy, scientific research, and economic growth, while XCOM requires careful squad positioning and mission planning.
This genre encourages analytical thinking and decision-making because players evaluate both immediate and long-term consequences within the game environment.
Interactive Learning Through Gameplay
Gaming environments present information through interactive systems that encourage players to participate actively in the experience. Players observe patterns, evaluate outcomes, and adjust their decisions as they progress through the game.
The research discussed in Frontiers in Psychology emphasizes how gameplay activities involve multiple cognitive processes at the same time. Visual attention, memory recall, and strategic planning often occur simultaneously as players interact with a game.
The Guardian’s analysis of gaming data also highlights the importance of engagement. Players remain focused because gameplay systems deliver continuous feedback that responds to each action.
These interactive qualities make gaming environments well-suited for individuals who enjoy learning through exploration and experimentation.
Gaming as a Tool for Lifelong Engagement
The growing body of research on gaming continues to highlight how interactive experiences stimulate cognitive activity. Strategy games encourage planning and analysis. Puzzle games focus on logical reasoning and pattern recognition. Action games strengthen observation and quick decision-making. Online casino games introduce probability-based thinking and adaptive gameplay strategies.
For lifelong learners who enjoy exploring new challenges, gaming offers a dynamic environment that encourages active participation. Each genre provides a different set of cognitive tasks that stimulate observation, analysis, and decision-making.
As digital technology continues to evolve, gaming platforms will likely expand the range of experiences available to players. Interactive design, visual storytelling, and responsive gameplay systems continue to shape how people engage with digital entertainment.
Gaming remains one of the most interactive forms of media available today. Through its combination of visual stimulation, strategic thinking, and problem-solving, it continues to provide engaging opportunities for lifelong learners who enjoy exploring new ideas and challenges.
In collaboration with Learning Genie: All Opinions are my own
If there’s one thing I value in education, it’s authentic and honest conversations about what’s really happening in classrooms. The January and February Learning Latte meetups with Learning Genie were exactly that.
These meetups offered grounded, reflective discussions about teacher preparation, real classroom challenges, and how tools like Learning Genie can support, rather than replace, our professional judgment. And with a focus on UDL, Portrait of a Graduate, and Differentiation, Learning Genie offers everything in one solution!
Here are some takeaways:
January: Teacher Preparation, TPA Season & the “Idea Inventory”
January’s Learning Latte meetup focused on the importance and value of truly listening to educators.
One of the most important parts of the conversation came from Robert Mayfield, who addressed a challenge that many of us have seen and experienced firsthand: pre-service teachers during the TPA season.
If you’ve worked with student teachers, you may have noticed how the demands of getting started affect them. They can be:
Overwhelmed
Time-strapped
Focused on and worried about meeting rubric requirements
Relying heavily on pre-existing lesson plans
Trying to survive and balance all of the new tasks that come with our work.
Robert highlighted a key concern: When pre-service teachers rely too heavily on ready-made lessons, they may miss the opportunity to build their own instructional toolkit. That’s where the concept of an “idea inventory” comes in.
What Is an Idea Inventory?
An idea inventory is not just a folder of saved lessons over the course of the school year or years. It is a curated, reflective collection of strategies used, activity ideas, differentiation techniques, assessment approaches, and adaptable frameworks.
The inventory includes:
Multiple entry points for learners
Flexible scaffolding ideas
Variations for different readiness levels
Culturally responsive examples
Developmentally aligned strategies
All of this is especially critical in early childhood and elementary settings, where differentiation is foundational.
The January discussion reinforced what I have noticed when working with other educators. New teachers need to understand how to differentiate effectively and have the resources they need to support their work.
This is where Learning Genie can make an impact. It supports reflective planning and enables teachers to connect observations to instruction. It makes differentiation visible, which is essential.
A good question to consider is: “How do we help future teachers think like designers of learning?”
Learning Genie supports that mindset shift. When teachers reflect on student observations and use those insights to plan intentionally, it helps build professional capacity and confidence. And it builds community when educators and companies connect!
Enjoy learning from and sharing feedback with Dr. Gene Shi
February’s Learning Latte offered a clear view and many insights into a lived classroom experience.
February’s meetup included educators Sandy Ferguson and Gina Ogilvie. Sandy began by sharing classroom experiences, grounding the conversation in real practice rather than theory.
I always want to know the stories of other educators, the why behind the choices in activities, strategies, and tools used in their classrooms, and the impact.
Many conversations about edtech center on features, dashboards, and integrations. But as I’ve long said, and as I heard in their message, what matters is the impact a tool makes inside the classroom.
Highlights from Sandy and Gina
Authentic Application: The conversation centered on how Learning Genie supports educators’ daily work. It helps with lesson planning, documentation, and communication, and it is easy to navigate and use.
Alignment with Developmental Needs: In early childhood, especially, the tools we use must align with how children learn best.
Teacher Confidence: When educators feel supported in leveraging technology to provide meaningful and personalized instruction, their confidence increases. Teacher confidence impacts classroom climate and positively boosts student engagement and interest in learning.
What stood out is that technology works best when it amplifies teacher expertise rather than replacing it, and when it enhances our students’ learning opportunities. Shifting our focus from replacement to the enhancement and transformation potential of these tools is important. Check out this video to learn more.
Connecting January and February: A Common Theme
Both sessions highlighted:
The importance of reflective practice
The need for intentional differentiation
The value of building professional capacity over time
The role of tools in supporting rather than shortcutting professional growth
January focused on building the foundation by helping new teachers develop their idea inventory. February provided a clear view of what this looks like in action, with experienced educators using tools to refine their professional practice and deepen students’ learning impact.
Final thoughts
The best educational tools don’t give us answers. I think that they help us ask better questions.
How are we differentiating?
What patterns are we noticing?
How are we building our “idea inventory?”
How are we supporting new teachers before they burn out?
Use these questions as a focus point, and I think you will find that a tool like Learning Genie is a catalyst for transformational and meaningful instruction and learning.
Enjoy sharing about Learning Genie in Pittsburgh and other conferences and school PD sessions!
About Rachelle
Dr. Rachelle Dené Poth is a Spanish and STEAM: What’s Next in Emerging Technology Teacher. Rachelle is also an attorney with a Juris Doctor degree from Duquesne University School of Law and a Master’s in Instructional Technology. Rachelle received her Doctorate in Instructional Technology, with a research focus on AI and Professional Development. In addition to teaching, she is a full-time consultant and works with companies and organizations to provide PD, speaking, and consulting services. Contact Rachelle for your event!
Rachelle is an ISTE-certified educator and community leader who served as president of the ISTE Teacher Education Network. By EdTech Digest, she was named the EdTech Trendsetter of 2024, one of 30 K-12 IT Influencers to follow in 2021, and one of 150 Women Global EdTech Thought Leaders in 2022.
She is the author of ten books, including ‘What The Tech? An Educator’s Guide to AI, AR/VR, the Metaverse and More” and ‘How To Teach AI’. In addition, other books include, “In Other Words: Quotes That Push Our Thinking,” “Unconventional Ways to Thrive in EDU,” “The Future is Now: Looking Back to Move Ahead,” “Chart A New Course: A Guide to Teaching Essential Skills for Tomorrow’s World, “True Story: Lessons That One Kid Taught Us,” “Things I Wish […] Knew” and her newest “How To Teach AI” is available from ISTE or on Amazon.
Contact Rachelle to schedule sessions about Artificial Intelligence, AI and the Law, Coding, AR/VR, and more for your school or event! Submit the Contact Form.
Follow Rachelle on Bluesky, Instagram, and X at @Rdene915
Interested in writing a guest blog for my site? I would love to share your ideas! Submit your post here. Looking for a new book to read? Find these available at bit.ly/Pothbooks
Also, check out my THRIVEinEDU Podcast here!
Join my show on THRIVEinEDU on Facebook. Join the group here.
Technology is evolving at a pace we have never experienced before. There have been so many changes in the world through artificial intelligence, automation, data science, and other emerging technologies. These are reshaping industries in real time. As an educator, I feel this shift daily, and I try to push myself to keep learning and looking for opportunities to do more for my students. The challenge is no longer simply preparing students for a job. It’s knowing how to prepare them for careers that may not even exist yet and also supporting them as they develop a variety of skills to be prepared.
When I think about how to prepare students for the uncertainty around the world of work, I look at insights from the World Economic Forum and its Future of Jobs research. While AI was ranked #3 for 2027 and now sits at #1 for 2030, the other rankings reinforce what we already know: adaptability, analytical thinking, creativity, and resilience are becoming increasingly important in our world.
If we cannot predict the careers that will exist five or ten years from now, the best we can do is prepare students to be flexible thinkers, confident problem-solvers, and ethical technology users. And this is why I believe that career-connected learning is essential.
Redefining “Career Ready”
When I thought about “career ready,” I aligned it with strong academics plus essential skills such as communication, collaboration, and the other “soft skills.” These remain foundational and necessary for success; however, with the changes in technology, there are other areas that I believe must be addressed as part of preparing students to be career-ready. Now, I include:
Digital and AI literacy
Ethical reasoning in technology use
Data awareness and cybersecurity knowledge
The ability to evaluate and question AI-generated information
Comfort navigating complex digital systems
Students need to understand how to use tools like generative AI, which means using it to enhance, not replace, their own learning. They can learn to brainstorm with AI, analyze outputs for bias or inaccuracy, and recognize when human judgment must be at the forefront, providing consistent oversight. Research and employer interviews show that employees will be expected to work alongside AI systems. That preparation has to begin in our classrooms, from K-12 and beyond.
Career-connected learning ensures students understand how what they are studying connects with real careers and real-world impact.
Why This Matters Now More Than Ever
According to projections highlighted by the World Economic Forum, millions of roles will be displaced by automation, while millions of new ones will emerge. This is not the first time. More than 100 years ago, thousands of traffic light controllers in New York were displaced by automation. They did not all lose their jobs; some shifted into other roles. And many of these new positions demand higher-order thinking, digital agility, and ethical decision-making.
I like to talk about some career options that minimally existed a few years ago:
AI prompt engineer
Ethical technologist
Data privacy consultant
These are just a few of the many growing fields of work, some of which are expanding because of AI. I think about how we are preparing our students and believe that career-connected learning will help show the connections between classroom content and workforce relevance. I also believe this is something that can be done in every classroom and in all content areas.
What Does Career-Connected Learning Look Like?
Career-connected learning is more than occasional career days. It is something that is embedded into daily instruction, not an extra element. It can include a variety of possibilities, such as:
Project-based learning connected to community or industry challenges, which builds relevance for students
Integration of AI, data science, and emerging technologies
Authentic problem-solving rooted in real scenarios
Partnerships with local businesses, universities, or nonprofits
Coding, AI, and cybersecurity challenges
Through opportunities like these, we can foster the development of student agency. When students understand how what they are learning connects to real opportunities, it sparks curiosity and increases student engagement and motivation. Learning becomes more purposeful, authentic, and meaningful.
Some ideas:
Artificial intelligence is an area that students need to understand. They need to know how AI systems function, how to evaluate the outputs, how bias can be embedded, and what the ethical responsibilities are for using AI. In career-connected classrooms, students might discuss and explore how the legal field, healthcare and business industries, and schools are using AI tools. They can engage in role-playing that focuses on ethical decision-making. The goal is for students to leverage AI as a partner in learning, rather than a replacement.
STEM is a great option to focus on career-connected learning. In my own classroom experiences, I’ve seen what happens when students combine AI tools with engineering design, language learning, and problem-solving. When students train image classifiers and then collaborate, problem-solve, and evaluate where the model fails, they are not just learning about the technology, they are developing skills in critical analysis and bias detection.
Cybersecurity is another area that is seeing tremendous growth. Students need to understand how their data is collected, protected, and, in some cases, misused. There are hundreds of thousands of unfilled cybersecurity roles in the United States alone, yet many students, and perhaps even educators, have not heard of careers such as threat analyst or security operations engineer. Lessons on cybersecurity can be done in all classes. Here are some examples that I have shared:
English: Analyze phishing emails as persuasive writing
With all of the technology, especially AI and automation, we have to stay focused on what makes us uniquely human. Technology will continue to evolve, even faster than it has been. But empathy, integrity, resilience, and collaboration will always matter, and we need to make sure that students develop these skills.
With career-connected learning opportunities, we will prepare students for success in the future, even in careers that don’t exist yet. We will offer opportunities for them to discover their interests and purpose, embrace the changes they will encounter, and be successful.
Guest post by Dr. Torrey Trust and Dr. Robert Maloy
Welcome to “Students, Teachers, and Chatbots!” In this monthly series, you will find classroom-ready learning plans to use as you explore different civic engagement issues and topics with students. Each learning plan is connected to one of the ISTE (International Society for Technology in Education) Standards for Students.
Agency for learners means each individual is actively involved in what is happening educationally and instructionally in classrooms and schools. Agency, however, is more than paying attention in class, completing assignments on time, and earning high grades on tests. Agency also means students believe they have a voice and choice in what and how they are learning. They believe they can take actions in their lives based on what they are learning in schools.
In social studies education, agency is connected to civic education, and by extension, democratic teaching in democratic classrooms. Teaching about democracy is a cornerstone of civics education, where students learn the foundations of government of the people, for the people, by the people. Democracy offers everyone a voice and choice in making decisions collectively and collaboratively. In theory, the same is true in democratic classrooms. Yet, in the past three decades, the practice of democratic classrooms has faded from view. In school after school, standardized achievement exams have brought with them greater emphasis on teacher control and accountability, large group instruction, and teaching to the test (Ravitch, 2016).
In the current era of mandated curriculum frameworks and high-stakes testing, learning about democracy in many classes is focused on memorizing the branches and structures of national, state, and local government; reviewing the history of the American Revolution and other signature events in U.S. history; and learning the names of well-known historical figures. Democracy is rarely a lived experience for students.
When we asked college students, “What do you remember was your first experience with democracy?” many responded with puzzled expressions. When we clarified that by “first experience with democracy,” we meant when they first recalled thinking they had personal agency, that their voice mattered, that they were part of a collective decision-making process, most recalled voting for the first time. But, when pressed to think back to when they were younger, some recalled experiences with democracy in family meetings where adults and children shared ideas and made plans; at summer camps and recreation programs where campers had choices about playtime activities; in libraries where young readers made choices about what books to read; on sports teams where coaches let youngsters try many different positions and choose the ones that engaged them the most. Those we spoke with valued these experiences because they felt their choices mattered and their decisions were respected, if not always agreed to, by the adults in charge.
In the following bonus learning plan from our AI and Civic Engagement book, student agency is front and center – students are encouraged to research, design, and work together to create real change that is meaningful to them and their schools.
Chapter 9 (Global Collaborator)
Bonus AI-Enhanced Learning Plan: AI Literacy for All: Collaboratively Crafting an AI Curriculum for Your School
Student Engagement Question: How do you think we should be using AI in our classes and school?
AI technologies play a significant role in the lives of teachers, students, administrators, families, and community members everywhere. As the latest GenAI tools, models, and features are released, all of us are learning more and more about the possibilities and complexities of artificial intelligence and its place in education.
Elected officials and policymakers have ideas for what needs to be done for AI in education. The White House Office of Science and Technology Policy under President Biden issued the “Blueprint for an AI Bill of Rights.” The European Union urged developers and users to ensure safe, secure, and trustworthy AI. Lawmakers in Congress have introduced the AI Literacy Act, intended to address the reality that “communities most often negatively impacted by AI-enabled technologies often have the least access to AI education” (Section 2: Findings). One group of researchers from the National Education Policy Center has urged a pause in the use of AI tools in schools to give everyone time to develop guidelines and regulations about their use for in-person and online learning (Williamson, Molnar, & Boninger, 2024). Organizations, including Common Sense Media and OpenAI, are working together to create AI education guidelines (Kelly, 2024).
But, what do students think about the role of AI in their education? Should they have opportunities to use GenAI in every class, subject, and topic? Should they learn about the ethical issues surrounding the design and production of GenAI tools (e.g., hallucination, bias, environmental impact, exploitation of human labor, intellectual property rights)? Should they have opportunities to build their AI-ready workforce skills?
This learning plan invites students to ensure their voice is heard when it comes to AI in their education. As global collaborators, they can work with others to develop an AI curriculum for their class, school, and/or district.
Learning Goal: Students will collaboratively draft an AI curriculum for their class, school, or district.
ACTIVITY 1: Research AI Curriculum Models and AI Literacy Frameworks/Models with GenAI
Invite students to curate a collection of AI curriculum frameworks, AI literacy frameworks and models, and any other resources and materials that can help them design an AI curriculum for their school or district. GenAI technologies can be a starting point for the research:
Example Prompt: “Create a table of at least 20 AI curriculum frameworks, AI literacy frameworks/models, or other sources to help me build an AI curriculum for my school. Make sure to include research-based frameworks and models. Include the name of the resource (column 1), a brief description of it (column 2), a description of why I should use it as a model or resource for my school’s AI curriculum (column 3), and a link to external sites to learn more information (column 4).”
Ask students to select at least 5 resources from their curation to critically examine and annotate, using the following AI-generated questions to guide their thinking:
What is the stated purpose or goal of this framework or resource?
Who created it, and what expertise or perspective do they bring (e.g., educators, technologists, policymakers, researchers)?
Whose voices are missing from the authorship or the examples used (e.g., Global South perspectives, Indigenous data sovereignty, non-corporate viewpoints)?
What definitions of “artificial intelligence” or “AI literacy” does it rely on? How does this shape the rest of the resource?
What big ideas, concepts, or competencies does this resource emphasize that you think should appear in your school’s AI curriculum? Why?
What specific AI definitions, skills, or knowledge domains does this resource identify as essential? Which of these are non-negotiable for your specific student body?
Who is left out by this framework? Does it require expensive hardware, high-speed internet, or prior coding knowledge that your students may not possess?
How does the resource address ethical, societal, or environmental implications of AI? What elements of this should be included in your curriculum?
Does the resource treat AI as a standalone Computer Science subject, or does it offer strategies for integrating AI literacy into multiple subjects and classes?
What does this resource do exceptionally well? How does it contribute to an informed, balanced, or future-ready AI curriculum?
What is missing from the resource that is important for your school’s context (e.g., student diversity, local community needs, digital divide, civic engagement)?
How well does this resource align with your district’s mission, values, or current technology curriculum?
What adaptations would you make to this resource to ensure your curriculum is inclusive, engaging, and accessible to all learners, including multilingual learners and students with disabilities?
How does this resource compare to the other frameworks you selected? Where do they overlap or diverge?
Then, ask students to work in groups and design their own AI curriculum for their class, school, or the district.
ACTIVITY 2: Collaboratively Design an AI Curriculum with GenAI and School/Community
Ask students to use a collaborative technology to get feedback on their AI curriculum from family members, community members, and educational leaders.
They might do this by sharing their AI curriculum in a Google Doc with commenting features on and asking others to add their thoughts/ideas/suggestions/questions as comments throughout the document; or they could share a link to their AI curriculum document and provide a virtual space like Padlet or IdeaBoardz to collect feedback and ideas.
Then, have students, in their teams, review the feedback they received and make revisions to their AI curriculum.
Ask students to present their AI curriculum to the entire class and get feedback from their peers.
Then, as a class, vote on one curriculum (or multiple curriculums that can be merged into one) to send to the school leadership as an official proposal.
REFLECTION QUESTIONS
What role do you want AI to play in your schooling? Why?
Do you want AI to be taught as a standalone topic/class? Why or why not?
What learning opportunities do you need in school to confidently navigate the Age of AI?
AI LITERACY QUESTIONS
What are the arguments in favor of or against establishing an AI literacy or AI education graduation requirement for students at your school or in your state?
What AI ethical issues did you include in your curriculum? Why did you include those issues?
ISTE Global Collaborator Criteria Addressed:
1.7.b Multiple Viewpoints. Students use collaborative technologies to work with others, including peers, experts or community members, to examine issues and problems from multiple viewpoints.
1.7.c Project Teams. Students contribute constructively to project teams, assuming various roles and responsibilities to work effectively toward a common goal.
1.7.d Local and Global Issues. Students explore local and global issues, and use collaborative technologies to work with others to investigate solutions.
Torrey Trust, Ph.D., is a Professor of Learning Technology in the College of Education at the University of Massachusetts Amherst. Her work centers on empowering educators and students to critically explore emerging technologies and make thoughtful, informed choices about their role in teaching and learning. Dr. Trust has received the University of Massachusetts Amherst Distinguished Teaching Award (2023), the College of Education Outstanding Teaching Award (2020), and the International Society for Technology in Education Making IT Happen Award (2018), which “honors outstanding educators and leaders who demonstrate extraordinary commitment, leadership, courage, and persistence in improving digital learning opportunities for students.” More recently, Dr. Trust has been a leading voice in exploring GenAI technologies in education and has been featured by several media outlets in articles and podcasts, including Educational Leadership, U.S. News & World Report, WIRED, Tech & Learning, The HILL, and EducationWeek. www.torreytrust.com
Robert W. Maloy is a senior lecturer in the College of Education at the University of Massachusetts Amherst, where he coordinates the history teacher education program and co-directs the TEAMS Tutoring Project, a community engagement/service learning initiative through which university students provide academic tutoring to culturally and linguistically diverse students in public schools throughout the Connecticut River Valley region of western Massachusetts. His research focuses on technology and educational change, teacher education, democratic teaching, and student learning. He is co-author of AI and Civic Engagement: 75+ Cross-Curricular Activities to Empower Your Students; Transforming Learning with New Technologies (4th edition); Kids Have All the Write Stuff: Revised and Updated for a Digital Age; Wiki Works: Teaching Web Research and Digital Literacy in History and Humanities Classrooms; We, the Students and Teachers: Teaching Democratically in the History and Social Studies Classroom; Ways of Writing with Young Kids: Teaching Creativity and Conventions Unconventionally; Kids Have All the Write Stuff: Inspiring Your Child to Put Pencil to Paper; The Essential Career Guide to Becoming a Middle and High School Teacher; Schools for an Information Age; and Partnerships for Improving Schools.
Throughout the country, states and districts are taking different approaches to student cell phone use. Some have implemented complete bans, while others are leaving the decision to individual schools or educators.
What I’ve learned over the past 12 years of using devices in my classroom is that while policies can help create structure, they don’t build consistent digital habits. Digital wellness has to be taught, modeled, practiced, and reflected upon.
Why tech habits matter
With so much access to technology, we need to guide students in developing good digital habits. Digital wellness involves helping students understand when technology is helpful, when it becomes draining, and how to make intentional choices that will keep them balanced and present. Cell phone bans and updated device policies have been designed to promote digital wellness in our schools.
I’ve observed that in schools with cell phone bans, students are more interactive with one another, and their socialization skills are improving. For some students, knowing where their phone is and having it close by is important, and I can relate. But I also understand the importance of disconnecting and being present in the moment, especially in our classrooms, to be more focused on learning.
I have done a variety of activities with students and educators focused on digital habits. In one of them, I focus on the “benefits” and “drains” of devices. A simple way to start is with activities that help students map their “digital day.” Ask them to list all the ways they use their phone or other devices from morning to night. Next, have them decide when the use helps learning (taking a photo of notes, defining or translating a word, keeping time, conducting research, or even recording a podcast draft) or benefits their well-being (such as tracking steps, doing meditation, or using focus apps). They then identify when it is draining (doomscrolling or game-playing; checking notifications; causing reduced energy, lack of attention, or mood changes).
Continue reading the rest of my article on Edutopia.