
DIY AI: Business Applications (Part 2)

AI has never been more accessible, and it will impact every corner of the independent school business office.

Aug 30, 2023  |  By Cecily Garber, NBOA

From the September/October 2023 Net Assets Magazine.


The first part of this article on AI and the business office appeared in August and can be found online here: DIY AI: Key Considerations (Part I)

Data Ready

“I can imagine a world where AI is very, very much a part of the fabric of schools,” said Eric Heilman, executive director of the Center for Institutional Research at Independent Schools (CIRIS) and senior institutional research strategist for the educational consulting group Mission & Data. New AI developments could have a particularly strong impact on the data analysis schools do or would like to do. But “all of that depends on having a much better data infrastructure than we currently have,” he said.

Private AI models, similar to those now open to the public, can be trained on proprietary data sets to protect privacy and improve accuracy, for example. “If you are training an [in-house, school-specific] AI, you want it watching data that is updating frequently, perhaps in real time,” Heilman said. It could predict financial trends, for example, or help identify patterns in revenues and expenses.

Data at schools today is often siloed and inconsistently managed, however. This past summer, the CIRIS Lab worked on a guide to independent school data management, which is slated to be released in September. (For more on this work and institutional research, see an interview with Heilman in “Voices from the Field”; see also “Strategies” in this issue for a case study on better data management.) Schools would do well to get their data in order now, not only to meet current best practices but also to prepare for AI products that may become available in the next couple of years.

In the nearer term, new tools may help business officers generate graphs and other visualizations of their quantitative data that were previously more difficult to prepare. The technology could “really lower the cost of entry for schools that want to dabble in more data analysis,” he explained. In a year or two there may be AI that directly generates a dashboard from a database, he surmised. And Heilman sees significant implications for qualitative data analysis, such as large school surveys. Coding and parsing large amounts of verbal or written data is hugely time consuming, but natural language models can do the basics in a fraction of the time.

Every Corner of the Business Office

Leaders in accounting, insurance, legal affairs and school database management are considering how AI will continue to impact business in schools and elsewhere. Here are some quick takes from business partners to independent schools.

Accounting

“Business folks in other industries are thinking about how they are being asked to do more with less than ever before,” said Carmel Wynkoop, partner at Armanino. Schools are in the same boat, albeit with fewer resources than large companies. But the beauty of newer AI technology is that it’s more accessible to organizations outside of the Fortune 500, she said. “The options are different now than they were two years ago.”

  • Automation: “All of us have been to some degree on the automation journey for a number of years,” said Rick Krueger, managing principal of service at CliftonLarsonAllen (CLA). “We’ve been considering how we can take processes that we’re doing very manually, remove some of the more mundane pieces of the work, and focus our professionals where they’re adding value to the process: the analysis, the conclusions, the decision making, the interactions with people — the part where it’s really important to have that person, less so the transferring of data and pure summarization.” Kathy Ferguson, partner at Armanino LLP, agreed: Business officers should no longer be bogged down exporting data from one database and importing it into another, for example, and data entry should really be a thing of the past, she said. Unless the system at hand is a dinosaur, automation is possible with support.

Armanino starts by developing a roadmap that usually begins with the lowest-hanging fruit, the easiest processes to automate that provide value. “Then as the tools build credibility and people start thinking of AI as their personal helper — which can work 24 hours a day, seven days a week, doesn’t call in sick and never has a bad day — we can build from there,” said Wynkoop. Never implement AI simply for the sake of it, she advised. New uses should address the biggest problems and opportunities and have a clear ROI.

In accounting in particular, firms are looking ahead to a potential workforce shortage. A predicted 100 million baby boomers will be retiring soon, while fewer than 40 million younger workers are poised to fill the gaps. “The only way we’re going to get around that is working smarter because we can’t work harder,” said Brenda Kahler, a director at Armanino.

  • Predictive analytics: Accounting firms are also able to provide services to aggregate data and offer predictive analytics in a way that was previously cost prohibitive for smaller clients. For schools, that could include financial data, donor information or student records. “Business officers may be wondering, Am I really going to be able to do these things?” said Ferguson. “I would reiterate that it’s not as complicated as you may think. You get experts to come in on the front end, help you set it up, and let the technology work for you.”
  • The audit and fraud detection: “Where I get really excited about what’s coming down the pipeline is using AI to identify risks,” said Krueger. “AI’s ability to read and interpret information seems to be exploding.” The data sets on which the newer AI models are trained have grown so much that “we’ve finally reached the threshold that it can understand human language and semantics at a much deeper level than we’ve ever had before,” explained Alexander White, data scientist manager at CLA. The only way an auditor can sift through the mountains of information at hand in a traditional approach “is by picking samples large enough to be considered representative of that population,” said Krueger. “That’s not the most targeted approach. AI on top of previously developed risk models can target [what we’re looking for] a lot faster.”
  • Analysis and strategy: Not only will auditors be able to digest larger amounts of data and do it more quickly than before, they will be able to produce analysis more quickly and ultimately provide more value, said John Toscano, principal and Independent School Segment leader at CLA. “There may be almost instantaneous KPI development and analysis and deep focus on interpretation more than the results of audit procedures,” he said.

Insurance

Like accounting firms, insurance brokers foresee a future where generative AI will allow both schools and firms to spend less time on mundane tasks and more on analysis, strategy and person-to-person interaction.

  • Forms: Filling out insurance applications is a necessary evil for obtaining coverage; it can be an onerous and time-consuming annual task for schools. Robert G. Riley, vice president at Fred C. Church, is optimistic that generative AI will help make that process faster. “While we work hard to make the annual renewal process as efficient as possible for clients, I am hopeful that AI will help create a system and process that makes the application information gathering component less cumbersome for schools.”
  • Information curation: For more than a hundred years, insurance firms have been gathering information and processing it to help clients make informed decisions about their coverage levels. That process can be labor intensive. Riley is hopeful that “AI can help do the administrative grunt work, and allow us as brokers and risk advisors to spend more time advising and strategizing with our clients.”

What the latest developments in AI have “been really good at is extracting data from documents and being able to structure it,” said Garrett Droege, senior vice president and director of innovation at IMA, a national retail insurance brokerage and now parent of Bolton & Company. Whereas comparing an insurance quote, binder and policy for discrepancies and errors could take an employee hours, AI can do it in minutes, he said, and more accurately. He cited an error-spotting test that pitted an AI company against a traditional broker: the broker took weeks while the AI finished in a matter of minutes, and the AI found a few more errors than the humans did.

“Everyone’s suddenly going to be much better at everything,” Droege mused. He built extensive expertise in cyber risk and digital innovation over 20 years, but he surmises well-trained AI could be even better than he is, and certainly faster, in terms of researching and synthesizing trends, legal actions and past claims. But he’s not worried about his job; rather, the key is to harness the technology and find head space to do more engaging work, he said.

Similarly, the decision-making process around setting limits of coverage and deductibles is currently labor intensive. Generative AI may be able to provide information about high-end and average claims in a fraction of the time it takes a person to sort through the data, said Riley. The question is, of course, whether the information will be trustworthy. If it proves itself reliable, generative AI could even support a more forward-facing portal where clients could easily access the trove of information insurance companies collect themselves, pondered Riley.

All these developments would give brokers more time to interact directly with clients and educate them about risk management.

  • New risks, new products: “Insurance companies may be able to predict risk better, which could hurt or help depending on what’s going on,” said Riley. “And while AI will help in certain areas, it will inevitably present some risks as well.” In line with common insurance practice, “don’t be surprised if you see AI-type exclusions pop up at some point.” Coverage may initially be denied for novel incidents until a new product is launched, similar to the development of cybersecurity insurance not long ago.

Insurance coverage for large AI models has already been established in healthcare, for example, said Droege. As the technology spreads, he expects insurance options to spread and, concurrently, more organizations to feel comfortable using and trusting the tools to be accurate and fair. AI models won’t be insurable until hallucinations (instances where AI makes up incorrect answers) are minimal; but once coverage exists, there will be financial resources should something go wrong, he said.

Database Management

Companies that process school administrative information and effectively serve as databases would seem to be top contenders for AI innovation, given the rich datasets they hold. Those developing the products, however, must be very careful to consider privacy, security and ethics when handling the sensitive data they store. “We are asking, How can we even safely experiment with these tools?” said Brittany Wilson, director of information technology at FACTS. They do not want to share intellectual property with an open model, which digests the data fed to it and feeds it back into the system, nor risk the privacy of the datasets they manage. “We’re focusing on leveraging AI internally first to do our due diligence and learn what some of the pitfalls may be without putting our clients at risk,” said Wilson.

But they will be changing and adapting. Before the advent of the most recent generative AI, “You had to do so much work to build your own models and train them on your data set, and you had to have so much available to you, like data scientists,” said Jeff Fraser, senior vice president of product engineering at Veracross. “Today [with the pretrained models] you can get started pretty quickly.” Thus all of the following are ideas, not features currently implemented in these systems. And the experts interviewed noted that the landscape may change quickly, as it has over the past six months or so.

  • Customer service: With the adoption of any financial processing system comes a learning curve — be it for families using it to make payments, school administrators using it to process information, or the companies themselves onboarding new schools with their product. Generative AI could offer possibilities for a more robust and responsive chatbot to help families and schools, said Wilson.

Veracross is also hoping to be able to better search its vast archives of documentation to help its representatives identify similar use cases and more quickly help schools solve issues they may be having. “Rather than it working like a Google search, like it does now, where you have to sort through examples and figure it out yourself, a generative AI search could process the information and return a direct answer,” said Jessica Wallis, senior vice president of product management at Veracross.

  • Analytics: While a product like Veracross already has data analytics capabilities built in, it takes some practice and understanding to use it well, like most software today. Generative AI may allow users to type in questions using natural language (that is, everyday writing) to produce analytics, instead of users having to learn a product-specific process, said Fraser.

There is also potential not only to analyze historical data more easily but also to tap into predictive analytics, whether for likelihood to meet payments, student performance, donor trends or enrollment markets. Perhaps advanced data analysis using AI could identify trends to help schools should a potential recession hit, Wallis said.

  • Automation: Financial administration products are looking for ways to ease the burden of administrative work, like balancing accounts.

Legal and HR

“Anything you do as an organization that involves compliance, whether it’s financial reporting, providing health insurance to your employees, or hiring people, if you’re using AI to do any of those functions that you already did, you are still liable and bound by all the existing regulatory structure for those things,” said Brenda Leong, partner at bnh.ai, a law firm specializing in artificial intelligence issues. “People will say, there’s no AI law. Well, there’s not a lot of law about AI in particular, but there’s a lot of law that governs the thing you’re doing, and that’s something we have to make clear to a lot of people in a lot of different contexts and levels of experience and business functions.”

  • Vendor screening: As noted above, the EEOC released guidance this spring indicating that employers are liable for any discrimination that occurs when they use an AI tool in their processes, even if the tool claims to be bias-free. Tools in other areas, such as student monitoring, are also at risk of inaccurate or biased output. “We need to pay a lot of attention to these tools because they are adapting so fast,” Leong said. Tools monitoring student behavior, for example, don’t have a lot of good science behind them, she said. “Some are built to do things that cannot be done.” Whatever AI tool a school adopts, it should not be the only way the school monitors or judges a situation, she advised.

So how can schools take advantage of these tools without putting a target on their backs? While specialists can perform audits, those are probably practical only for large companies. Schools that work with AI vendors should look for transparency from the vendor about how they are complying with laws and regulations, as well as about claims of accuracy and fairness. Schools can also use a vendor screening tool with questions like: What data is the model trained on? How often is it updated? Another model to consider is “FAST” — does the tool offer Fairness, Accountability, Social benefit and Transparency?

“It does take some level of awareness and understanding to be able to do that well enough that you can be confident,” said Leong. She recommends everyone learn more about AI to be able to ask better questions and make better judgments about using or not using certain tools.

Schools can also build in processes that check for accuracy and bias from a human standpoint. If a school uses an AI-supported resume screener, for example, which recommends three top candidates, the school can spot-check applications that were cut to ensure the AI is making appropriate choices.
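For schools with a technically inclined staff member, the spot-check described above can be as simple as randomly sampling the cut applications for human review. The sketch below is purely illustrative (the function name, applicant identifiers and sample size are hypothetical, not part of any vendor's product), assuming the school can export a list of all applicants and the AI's recommended candidates:

```python
import random

def sample_rejected_for_review(applications, ai_top_picks, sample_size=5, seed=None):
    """Randomly sample applications the AI screener cut, for human spot-checking.

    applications: list of all applicant identifiers
    ai_top_picks: identifiers the AI recommended
    Returns up to sample_size rejected applicants to review by hand.
    """
    picked = set(ai_top_picks)
    rejected = [a for a in applications if a not in picked]
    rng = random.Random(seed)  # fixed seed makes the review sample reproducible
    return rng.sample(rejected, min(sample_size, len(rejected)))

# Hypothetical example: 3 AI picks out of 10 applicants; review 5 of the 7 cut
applicants = [f"applicant_{i}" for i in range(10)]
top_three = ["applicant_2", "applicant_5", "applicant_9"]
review_queue = sample_rejected_for_review(applicants, top_three, sample_size=5, seed=42)
```

A reviewer would then read the sampled applications and ask whether any clearly strong candidate was cut, which is a rough, human-level signal of whether the screener is making appropriate choices.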

These are still early days for the understanding and adoption of this technology, and new developments, applications and risks are likely to appear by the week. But if anything, these examples from business partners may help schools spark their own thinking about how they might use AI themselves, particularly in the open formats now available — and to use the new tools safely and accurately as students, faculty, staff and families navigate this brave new world together.


Author

Cecily Garber

Cecily Garber, Ph.D.

Associate Vice President, Communications and Member Relations

NBOA

Arlington, VA

Cecily Garber is the editor of NBOA's Net Assets magazine, and directs NBOA's publication efforts, which includes books, reports and industry guidance. She also oversees the communications and member relations team, which is responsible for all membership, marketing and communications efforts.