Should CFOs be leading generative AI adoption?

Does the term autonomous finance ring a bell—or maybe an alarm?

If you were at the 2023 Gartner CFO and Financial Executive Conference in Maryland last week, you’re likely familiar with the phrase, which (like autonomous vehicles) paints a picture of a future where humans aren’t steering.

According to Mark McDonald, a Gartner senior director analyst who presented throughout the event, two out of every three CFOs forecast that autonomous finance will be common within six years.

For any of this to materialize, however, stakeholders across organizations—from leadership to project teams—will need to embrace artificial intelligence (AI). What makes this such a tall order is that AI—specifically generative AI tools like ChatGPT—is viewed from wildly different vantages depending on your role, with many employees feeling threatened by AI’s evolving capabilities. 

In fact, 70 percent of employees believe AI will take their job, McDonald declared at the event, which isn’t surprising when you consider how many business leaders have introduced AI into the workflow to date. 

“We’ve got leaders who want AI to do everything, and employees don’t want it to do anything,” McDonald said, adding that “AI is not a project at all—it’s a competency.”

Frame generative AI as a tool—not a peer

McDonald went on to explain that leaders need to temper their expectations and manage their approach when onboarding new solutions, making sure not to present generative AI “projects” that appear to workers as training their replacement.

Rather, McDonald and other Gartner analysts emphasized throughout the event that generative AI adoption should be a piecemeal process, introduced to teams as a work aid—not a replacement or even a new colleague—to help them focus on other priorities and improve productivity. (TL;DR: AI is here to help!)

But why does this start with the finance team? And what makes the CFO uniquely suited to lead the charge—and aren’t they concerned about the implications of “autonomous finance” for their own job prospects?

“CFOs should have a larger role in generative AI than they have in other technologies,” Rajesh Kandaswamy, a VP AI analyst at Gartner, told the conference. His reasoning: the CFO has the best handle on the company’s sources of revenue, its largest cost drains, and its best opportunities for cost savings and profits. 

Through this lens, the CFO can make a more informed decision about implementing generative AI than IT staff. “The CFO is probably in the best position to educate the company to start thinking strategically, not the CIO,” Kandaswamy added.

Focus less on the autonomous aspect, more on the outcomes

Even with informed leadership, however, Kandaswamy and his peers warn that any AI adoption is rife with risks, and no company should take the “autonomous” label too seriously. This is especially true in finance, as generative AI algorithms shouldn’t be blindly exposed to proprietary or confidential business data, for instance. 

There’s also the broader risk of copyright infringement, “deepfake” misinformation, and recorded instances of generative AI tools outright lying to human practitioners, all of which emphasize the need for more human intervention as these tools join the IT stack, not less. 

This is underscored by the fact that the technologists behind many of the most popular generative AI applications have been honest about their limitations and drawbacks. GPT-4, for instance, is less than 80 percent accurate in a business context, according to findings from OpenAI—the company behind ChatGPT—published in March.

Task generative AI with low-risk copy to start

So where should CFOs start when bringing AI into their business? According to Gartner experts, “anywhere they have content generation that is of low risk,” including sales memos, emails and website copy—the caveat for all of these being that AI is only used as a starting point, with humans stepping in to finalize any drafts.

It’s also imperative that a member of senior leadership is fully accountable for the technology, whether that’s the CFO or financial director themselves, or a designated AI manager who can enforce a responsible AI policy. 

What should a responsible AI policy look like? While the full list of requirements can quickly become exhaustive, some pillars outlined by Gartner include: the technology serves a broad range of stakeholders; mines high-integrity data; protects against attacks and rogue use; shields user data; avoids harming people, property or the environment; and is explainable, transparent and reliable.

At Boast AI, our human expertise is our greatest asset, as our R&D and tax pros leverage the latest and greatest technologies to streamline claim creation for our partners while maximizing their access to non-dilutive funding. 

To learn how Boast AI can work for you, schedule a call today.
