How to demonstrate ROI for your GenAI projects

In this article, we explore why it’s hard to evaluate the return on AI investments and how to identify metrics that can more accurately reflect ROI.

These days, companies need to show a return on investment (ROI) for basically any business expenditure. But for a still-newish technology like generative AI, understanding and demonstrating ROI can be a challenge. In a recent McKinsey report, 65% of global executives said their organizations were regularly using generative AI (GenAI). But even with widespread adoption and the topic dominating headlines and conference rooms, only a small percentage of AI projects actually make it into production.

Why is that? We’ve been hearing from our sales and customer success teams that there are not yet any set metrics or universally accepted standards for proving the ROI of AI tools and technologies. But that doesn’t mean AI isn’t having a positive impact, nor that there’s no way to demonstrate that impact.

There are two major arenas where our customers are reporting improvements attributable to GenAI: developer productivity and developer experience. AI-powered tools help developers automate toil and rework, improving both productivity and happiness/satisfaction at work. These tools also make it easier for developers to upskill with new languages or frameworks. While benefits like these can be hard to quantify in conventional ROI terms, they are nonetheless real benefits.

Below, we’ll look briefly at why it’s hard to evaluate the return on AI investments and how to identify metrics that can more accurately reflect ROI.

Why it’s so hard to assess ROI for AI

So far, there are no universally accepted metrics for determining the ROI of AI technologies. A major reason is that many benefits of AI, like productivity and employee satisfaction, have indirect rather than direct impacts on a company’s performance. Indirect effects are, of course, harder to quantify.

For example, AI-powered code generation tools can save developers hours of busywork, giving them more time and energy to focus on higher-order problems and projects that require creativity and innovation. Those developers tend to be happier as well as more productive. Happier, more productive teams can equate to faster time-to-market for new features—and, thereby, happier customers. Real but intangible benefits like these are difficult to quantify in direct financial terms.

Other benefits organizations might see from AI include “increases in customer engagement, improvements in operational efficiency, [and] breakthroughs in more data-driven and accurate decision-making,” according to an article from cloud communication company Twilio. “These things can be huge for the success of your business but might not correlate directly to a number at the bottom of your quarterly report.”

The fact that GenAI remains a relatively new and quickly evolving technology also makes it harder to calculate ROI. A report from IBM’s Institute for Business Value revealed what its authors called “yawning outcome gaps across AI projects,” with most falling short of the financial outcomes shareholders are hoping for. The IBM report found that “average ROI on enterprise-wide initiatives is just 5.9%, well below the typical 10% cost of capital. In essence,” the report continues, “AI is following the ‘J-curve’ pattern typical for transformative technologies”: an initial loss, followed by a significant gain. In other words, it’s not entirely surprising that AI projects don’t seem to be delivering returns commensurate with the hype. It also suggests that those returns will grow over time.
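To make that J-curve concrete, here’s a minimal sketch of the underlying arithmetic, using the standard formula ROI = (benefit - cost) / cost. All of the yearly figures below are hypothetical, chosen only to show how an initiative can start underwater, hover near the report’s 5.9% average, and eventually clear the 10% cost-of-capital hurdle; they are not taken from the IBM data.

```python
# Hypothetical illustration of the ROI "J-curve" for an AI initiative.
# ROI is computed the usual way: (benefit - cost) / cost.
# None of these figures come from the IBM report; they only show the shape of the curve.

cost_of_capital = 0.10  # the typical hurdle rate cited above

# (year, total cost, realized benefit) -- all numbers are made up for illustration
years = [
    (1, 1_000_000, 800_000),    # heavy up-front investment, little payoff yet
    (2, 1_200_000, 1_270_000),  # roughly the ~5.9% average ROI the report describes
    (3, 1_300_000, 1_700_000),  # benefits compound as adoption and tooling mature
]

for year, cost, benefit in years:
    roi = (benefit - cost) / cost
    verdict = "above" if roi > cost_of_capital else "below"
    print(f"Year {year}: ROI = {roi:.1%} ({verdict} the {cost_of_capital:.0%} cost of capital)")
```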

How to get the full picture

To get a full picture of how AI projects are impacting your organization, you may need to look beyond traditional or financial ROI calculations. Instead, “companies need to identify metrics that capture both financial benefits and strategic outcomes—such as a better user experience, broader access to capabilities previously requiring higher skills, and employee and customer satisfaction,” writes Lucas Mearian for Computerworld.

Align AI projects with business priorities

The IBM report we cited above similarly found that “aligning AI with business priorities yields much higher ROI than ad hoc projects.” Their survey of 2,500 executives in 34 business and technology roles in 16 countries found that “organizations that view AI as important to their business strategy are 1.8 times more likely to be effective with their AI initiatives and achieve nearly twice the ROI” of their competitors.

One guide to measuring the ROI of AI projects suggests “[correlating] the model’s performance with specific business outcomes” through “proxy variables like customer satisfaction, productivity improvements, or new revenue enablement to showcase the impacts of AI projects.” Metrics like percentage of workflow automation, developer productivity, customer service improvements, and high adoption/engagement rates can reflect meaningful if difficult-to-quantify returns on your AI investments.
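As a rough illustration of how proxy metrics might be rolled up into something comparable to a traditional ROI figure, here’s a minimal sketch. Every name and number in it (team size, adoption rate, hours saved, seat price, enablement cost) is an assumption made for the example, not a benchmark or a standard formula; in practice the inputs would come from your own surveys, telemetry, and contracts.

```python
# Hypothetical sketch: estimating the return on an AI coding tool from proxy metrics.
# Every input below is an assumption for illustration, not a benchmark.

developers = 200
adoption_rate = 0.70                  # share of developers actively using the tool
hours_saved_per_dev_per_week = 1.5    # would come from surveys or telemetry in practice
loaded_hourly_cost = 90               # assumed fully loaded cost of a developer hour (USD)
weeks_per_year = 46

per_seat_annual_cost = 500            # assumed license cost per developer per year
enablement_cost = 25_000              # assumed training / community-of-practice investment
annual_cost = developers * per_seat_annual_cost + enablement_cost

# Indirect benefit expressed in dollars: time the tool frees up for higher-order work
hours_saved = developers * adoption_rate * hours_saved_per_dev_per_week * weeks_per_year
estimated_benefit = hours_saved * loaded_hourly_cost

roi = (estimated_benefit - annual_cost) / annual_cost
print(f"Estimated annual benefit: ${estimated_benefit:,.0f}")
print(f"Estimated annual cost:    ${annual_cost:,.0f}")
print(f"Estimated ROI:            {roi:.0%}")
```

The specific output isn’t the point; the structure is. Each proxy metric is measured for a specific deployment, translated into comparable terms, and set against the full cost of the tool, which also makes it easier to see which projects are paying off and which aren’t.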

Param Vir Singh of Carnegie Mellon’s Tepper School of Business told Computerworld that a common mistake companies make is looking at the return on AI investments across all of an organization’s projects at once, instead of zeroing in on each project individually. Evaluating projects one at a time also makes it more apparent which are working and which aren’t. “When AI is deployed in a specific place—say, employees’ access to Copilot to do a certain activity—then it’s easier to measure productivity gains,” said Singh.

Focus on developer happiness and productivity

Developer productivity and developer experience are the two areas where organizations are reaping the most rewards from their investments in AI tools, according to our conversations with customers. That certainly tracks with GitHub’s quantitative research into how GitHub Copilot, an AI code-generation tool, affects developer happiness and productivity: 88% of developers using Copilot reported that they were more productive, 74% felt able to focus on more satisfying work, and 87% reported less time spent on repetitive tasks.

One crucial caveat: AI code-gen tools alone won’t move the needle, at least not for long. As we’ve written, AI can generate code, but it lacks the judgment to determine whether that chunk of code will fit the need and work as intended. Nor can AI understand the range of possible input parameters and select the optimal algorithm for what you need. AI models don’t come out of the box understanding the historical context behind your architecture decisions or the particular requirements of your codebase—though they can make it easier for your engineering teams to understand those factors.

To realize the value of their AI investments, organizations need to ensure that employees continue using those tools after the launch and initial excitement. To support the adoption and ongoing success of AI tools that promise to boost developer productivity, you need a community of practice dedicated to sharing knowledge and best practices around these technologies.

GitHub COO Kyle Daigle told Computerworld that ROI “is baked into genAI code development because it reduces time to market, frees up developer time, and allows developers to focus more on creative tasks than menial chores.” The ROI may not be immediately striking or obviously attributable to AI projects, but over time, that J-shaped curve will start to emerge as projects scale, AI models are refined through higher-quality training data and human-in-the-loop engineering practices, and developers’ increased bandwidth translates to more innovative products delivered to market more quickly.

But as we suggested above, AI-powered tools don’t succeed without the human element. For these tools to deliver ROI, your organization needs knowledgeable people with a means of seamlessly capturing and communicating their knowledge to the community and to AI models.

Stack Overflow for Teams and our GenAI-powered add-on module, OverflowAI, are purpose-built to help you create and sustain a community of practice around your AI projects, along with other tools, technologies, and company initiatives. Our customers have always relied on our platform to improve developer happiness and satisfaction, and the addition of OverflowAI to Stack Overflow for Teams is another big step in that journey.

Looking for a partner in your GenAI journey?

Let us show you how to build a world-class knowledge base with Stack Overflow for Teams.