K-12 Leaders on the Impact of Artificial Intelligence and Generative AI
While artificial intelligence and machine learning are not new technologies, recent leaps in the technology driving these tools are rapidly transforming our day-to-day lives. From predictive text tools that anticipate keystrokes before they are typed, to financial institutions forecasting purchasing habits, to tools like ChatGPT generating sophisticated marketing materials, AI, or more specifically generative AI, has burst onto the scene seemingly overnight.
As is the case with any transformative technology, an ethical debate has ensued around best practices and the impacts (positive and negative) it may have on society.
In the edLeader Panel “Artificial Intelligence and Generative AI: Empowering a Deeper Conversation,” three innovative superintendents and the executive director of the Indiana CTO Council discussed the challenges of effectively navigating the use of AI and generative AI in K-12 education. The group shared guidelines from the CoSN/AASA EmpowerED Superintendents Initiative, which were created to help school leaders implement policies to address AI/generative AI in schools.
Guidance from EmpowerED Superintendents
The purpose of the EmpowerED Superintendents Initiative is to support superintendents through collaborative opportunities and discussions, equipping them not only to understand emerging technologies but also to make informed decisions about technology investments that best align with the needs of their districts.
With ongoing debates around AI and generative AI taking center stage in K-12 education, the initiative recently published a resource guide, Artificial Intelligence & Generative AI: Empowering a Deeper Conversation, which was created under the leadership of Pete Just, a former administrator who served as chief operations officer and chief technology officer in two school districts. He is also a former CoSN board member and is now Executive Director of the Indiana CTO Council.
Understanding the Differences Between AI and Generative AI
Technically, AI “is the intellect shown by machines which is based on the synthesized and predictive inferences of the information with which they are trained.” In layman’s terms, Just said AI is “really anything where we’re using some computational tools to try to do some aspects of what humans do.” In general, AI is focused on conducting one function based on specific programming information.
Generative AI takes artificial intelligence one step further by using “algorithms to produce new data, often in the form of image or text.” Just explained that the operative word is “new.” Generative AI develops something new based on the question asked or the prompt that is given.
7 Essential Guidelines for K-12 Leaders
The debate surrounding AI and generative AI in K-12 education is not whether it should be used, but rather how it should be used and the ethical standards that should be applied when it is used. The Artificial Intelligence & Generative AI resource guide features seven essential guidelines for K-12 leaders:
- Awareness: Ensure that users are aware of AI tools and the potential benefits for K-12 education.
- Limitations: Explain the limitations of AI tools and the potential for errors or inaccuracies.
- Ethics and Etiquette: Promote good online etiquette, including proofreading and fact-checking.
- Ongoing Training: Provide ongoing innovation training and reinforcement on the best ways to use AI tools.
- Reporting: Educate the school community about how to report incidents or concerns.
- Policies: Set policies to create a culture of safe and responsible use.
- Privacy and Security: Review privacy and security measures through a new lens.
Awareness: Learning Together through Open Communication
Generative AI has been moving at such a breakneck pace that no one has all the answers right now. Matthew Friedman, Ed.D., Superintendent of Schools for Quakertown Community School District (PA), and Kelly May-Vollmar, Ed.D., Superintendent of Desert Sands Unified School District (CA), both agree that building in time for professional development opportunities to learn about generative AI tools is key to creating comfort with the technology while everyone learns about it together.
May-Vollmar explained that with anything new, there is usually a level of fear, and the initial reaction is to ban it or remove it. However, this is unrealistic because AI is not going away, so creating “an arena where we’re all learning together is a great first step because it takes away some of that anxiety ‘that I need to know everything about it’ before I talk to you about it.”
Friedman stressed the importance of broad community support, advising leaders “to be transparent and inform. [They] need to talk to a lot of different stakeholders, from classroom teachers to administrators to parents and to students.” In his district, they do that through professional development opportunities, parent forums, and by educating students on the available tools and what is and is not acceptable use inside the classroom.
Limitations: An Opportunity to Teach Critical Thinking Skills
With the impressive advances in generative AI, some people have come to believe that generative AI tools are infallible. A recent case, in which a lawyer used ChatGPT to draft a legal filing that cited nonexistent cases, demonstrated otherwise: these tools have real limitations, including inaccuracies and errors.
David Miyashiro, Ed.D., Superintendent of Cajon Valley Union School District (CA), shared that his recent experience implementing a generative AI tool in his district underscored the need to recognize these limitations. As part of his district's work with SchoolJoy, a personalized learning platform that uses AI-powered tools to improve student engagement, he was introduced to the term "hallucination," which he said is a technical term "for when generative AI produces false information as if it were true. And it does that not because it's trying to be disingenuous or to be wrong. It doesn't know better." (SchoolJoy educated the district on hallucinations and how they happen.)
In short, a generative AI tool can “hallucinate” because it is only as good as the information that it receives, which means that teaching students to become critical thinkers who understand how to fact-check their information is more important now than ever.
Ethics and Etiquette: Digital Citizenship and Equity of Access
Fact-checking, proofreading, and discernment are necessary for addressing limitations in generative AI, but May-Vollmar believes that teaching students how to be good digital citizens is essential. She noted, “It’s really important that they understand the potential impact that AI can have on society, on the economy, and that we’ve got to train them to be good, informed decision makers as they’re choosing how to use AI.”
Students not only need to know how to use AI tools, but they also need to understand the implications of using these tools. For Miyashiro, access to AI for students is an ethical issue that addresses equity and fairness. He said, “In the beginning when we saw some districts completely shut it down, that was unethical behavior because, just like the kids with devices and ubiquitous access to WiFi have a greater advantage than their underserved, underprivileged peers, this technology is going to exacerbate that gap.” Ensuring that all kids, not just kids from affluent backgrounds, have access to these tools is an ethical decision that school districts need to consider.
Ongoing Training: District Integration
To best support teachers, administrators, and support staff, Just recommends integrating AI training into a district’s existing ongoing training protocols. In Friedman’s district, they were intentional about dispelling the myths that surrounded AI—that AI is going to replace humans or completely change instructional practices. They gathered staff to share experiences about instructional practices, student use, and success stories. As they did this, they realized that as they learn together, they can grow together with this new technology.
Reporting: How to Address Incidents and Concerns
Since the era of generative AI is still nascent, education leaders are challenged with putting measures in place to appropriately report incidents and concerns from students, parents, teachers, and the broader stakeholder community. The panelists agreed that this is very new territory for school districts, but in many ways they already have protocols in place to address issues like privacy concerns, plagiarism, and other misbehaviors. As they consider reporting processes, district leaders should:
- Identify and define the concern.
- Determine what kind of private information is being exposed.
- Ask who is reporting the incident—parent, student, teacher, administrator?
- Define who should receive the report, or better yet, is there a chain of command based on the nature of the report?
Each district needs to determine how to address concerns when they arise. Most likely, they have these protocols in place for existing issues and can adjust those and apply them to generative AI.
Policies: Creating a Safe and Responsible Culture
As we enter this new era, it will be critical for school districts to establish policies that create a culture of safe and responsible use. Such policies can mitigate the risks of using AI tools in a school environment while also identifying effective ways to leverage the power of generative AI for the benefit of both students and teachers.
In Cajon Valley, the district has been fortunate to receive sample policies from the California School Boards Association as new guidance emerges regarding FERPA. Many of the sample policies address the issues discussed above regarding access, academic integrity, ethical concerns, and privacy. While these samples are helpful, Miyashiro pointed out that districts are entering new territory when it comes to content creation and the ability to "hyper-personalize" materials for students that align with their individual needs and interests. This capability will most likely affect education publishers and content creators the most, requiring new policies to address the personalization of curricula using AI.
Privacy and Security: Review with a New Lens
With the new privacy and security concerns raised by generative AI, it is important for districts to review their privacy and security measures through a new lens. May-Vollmar stressed that it is really important to look at identifiable information. She explained that “AI has the ability to do different things that we’re not accustomed to working with in the past, and so we really have to make sure that our policies are not so prescriptive that we have to change them constantly because there’s going to be continued privacy and security measures that come up as new technologies advance, as AI advances.”
Learn more about this edWeb broadcast, Artificial Intelligence and Generative AI: Empowering a Deeper Conversation, presented by CoSN and AASA, and sponsored by ClassLink.
Join the Community
Super-Connected is a free professional learning community for school superintendents, district leadership, and aspiring district leaders.
AASA is the premier association for school system leaders and serves as the national voice for public education and district leadership on Capitol Hill.
CoSN (the Consortium for School Networking) is the premier professional association for school system technology leaders. CoSN provides thought leadership resources, community, best practices and advocacy tools to help leaders succeed in the digital transformation. CoSN represents over 13 million students in school districts nationwide and continues to grow as a powerful and influential voice in K-12 education.
ClassLink is a global education provider of identity and analytics products that create more time for learning and help schools better understand digital engagement. As leading advocates for open data standards, we offer instant access to apps and files with single sign-on, streamline class rostering, automate account provisioning, and provide actionable analytics. ClassLink empowers 19 million students and staff in over 2,500 school systems. Visit classlink.com to learn more.
Article by Ginny Kirkland, based on this edLeader Panel