If there is a topic that has dominated independent schools this past year, it is the use of artificial intelligence. This still-emerging technology and its applications for students, faculty and administration seem to grow exponentially each day. It offers exciting new opportunities, but left unchecked, it could pose a substantial threat to the very fabric of our educational missions.
For this reason, earlier this month, hundreds of attendees tuned into NBOA’s town hall-style webinar on managing AI risk within independent schools. While I don’t want to give away all the good stuff, I can’t resist the opportunity to lift up a few key takeaways. (In case you missed it, NBOA members have access to all NBOA webinar recordings, including this one, as part of their school’s membership.) Longtime NBOA friend and colleague Brad Rathgeber of One Schoolhouse, NBOA's online learning partner, moderated a discussion that featured Sarah Hanawald of One Schoolhouse, Alex Podchaski of Trinity Preparatory School of Florida and A.J. Zottola of Venable LLP. This triumvirate of an independent school tech pioneer, a longtime independent school technology director and legal counsel illustrated the complex nature of AI and the multiple lenses through which schools should view the technology in order to support its use and mitigate exposure to potential risks.
It’s my understanding that while some schools have developed an AI policy for students, faculty and staff, many have not. This was one aspect of the deep dive session on AI at the 2024 NBOA Annual Meeting – and it will be part of an upcoming issue of Net Assets magazine landing on your desks this summer. For school leaders considering the development of such a policy, a good place to start could be Hanawald’s three key questions about AI use at school:
- What are the tasks that AI cannot or should not do?
- What tasks can AI help with by augmenting faculty and staff members’ actions?
- What tasks can be largely trusted to AI?
During the NBOA Annual Meeting deep dive session, ATLIS Executive Director Christina Lewellen presented the framework of Lake Highland Prep’s policy, which similarly divided actions into “red light,” “yellow light” and “green light” categories for AI use – some tasks should never be done with AI, some can be done with caution and others are fine regardless of circumstance.
Beyond establishing a policy for your school, another key takeaway was to read the policies of the tools you’re using. This could be a new AI-powered tool, or a tool you have been using that now has AI built into it. While reading terms and conditions is never anyone’s favorite job, it is one the business and technology offices can do well – and it is important. The town hall panelists noted that long-established companies within the educational space tend to have better privacy and security protections in place than a small AI start-up that has not had years to conform its product to existing rules and expectations. One piece of good news is that the cybersecurity and privacy standards that you’d look for in any agreement with a digital vendor will largely apply to AI as well, so much of what you already know can be put to use here. This follows the long-held business officer mantra: “Don’t reinvent the wheel.”
Panelists also talked about reining in the use of tools across the school to ensure they are secure and streamlined. Redirecting employees to a smaller, approved set of tools may require some adjustment but can help manage risk tremendously. Moreover, neither the most zealous AI user on the faculty nor its most vocal detractor should be the sole voice in the policy; rather, the policy should align with the needs of your school’s academic program and students, with faculty input. To win buy-in for these kinds of organizational changes, it helps to have a decision-making team drawn from various departments, so disparate parts of the school have a voice. We’re taking that advice at NBOA, as our own AI working group includes representatives from administration, programs and marketing-communications.
It is indeed an interesting time to work in education. As we harness the power of AI tools that hold so much promise, we must simultaneously take the necessary steps to protect privacy and intellectual property, which are often at the heart of our schools’ value proposition. Business officers often have to ask, “What if…?” There is no better time to pose these provocative questions as we venture into the uncharted territory of AI, spurring critical thinking while supporting the innovative use of valuable tools that can enhance learning in service to our students and families.
Follow NBOA President and CEO Jeff Shields on LinkedIn.