Last month, Data Talks hosted Designing for Analytics founder Brian T. O’Neill for a discussion on human-centered design in data (and if you haven’t listened to the episode yet, grab a basket of laundry and settle in; it’s a good one). The interview is full of little aha moments, but my favorite is around 34:00 when we’re talking about embedded business intelligence success factors, and he says something that at first blush sounds obvious but then makes you check your assumptions at the door:
“And so what I would want to know from [stakeholders] is: You would like to put a BI implementation into your tool so that…? Fill in the blank. So that, fill in the blank. Because a BI implementation is a solution to some problem, and it may not be the same problem for every single SaaS company. They may have different types of questions they want to answer with the data. […] That part needs to be really clear or else the team’s likelihood of delivering success is going to be low. And I know this might sound really basic, but it’s not. I run into this all the time.”
As a content marketer, I’ve read dozens of articles about marketing success factors and value attribution. I can’t help myself. I’m always hoping the next one will clear up all the ambiguity and tell me once and for all whether my efforts are paying off. And I know that you are likely just as interested in measuring your success because I’ve authored a number of KPI-related articles, and they’re some of our most popular pieces of content.
But here’s the thing: grabbing a bunch of success factors from a listicle is putting the cart before the horse. Or, as O’Neill puts it, the cast before the break:
“It’s like going into the doctor’s office and saying, ‘Doctor, I need a cast on my arm!’ No good doctor in his [right] mind would just whip out the cast-making materials and send you on your way. The real goal is to heal the arm, not to just provide a cast because you asked for it.”
Whatever the analogy, it’s crucial to assess your pain points and goals before prescribing a solution and determining how to measure its success. Not only that, but your measures of success are likely to change as your requirements and implementation evolve.
Let’s say, for example, you’re a SaaS CRM provider, and your primary reason for licensing a third-party embedded BI solution is that your IT team is swamped with reporting requests from customers. It’s taking too long for users to get the information they need to make informed business decisions, and you’re struggling to complete other projects. The market is starting to expect a BI component in applications like yours, and sales are suffering as a result. (Sound familiar? This is a pretty common scenario.) So you purchase an embedded solution and, for the initial release, simply expose a library of the most in-demand reports to users.
Launch day comes and goes. A month into your BI offering, only 5% of your users are accessing the reports you built. After two months, the number is closer to 7%, and you’re seeing an average of 350 report executions per day. Your executives want to know: Is their BI investment paying off? Is it working?
The problem with these business intelligence success factors is that they exist in a vacuum; we’re not comparing them to a benchmark or target. Moreover, they don’t speak to the BI implementation’s core objectives: reducing the reporting backlog and improving sales.
If we focus on the numbers we really want to see move in this case, our business intelligence success factors might look more like this:
- Number of monthly ad hoc report requests to IT (compared to pre-BI monthly average baseline, goal of 0)
- Number of weekly IT hours spent on report building (compared to pre-BI weekly hours, goal of 0)
- Number of monthly sales losses attributable to reporting deficiencies (compared to pre-BI monthly losses, goal of 0)
- Estimated monthly revenue loss, including customer churn, attributable to reporting deficiencies (compared to pre-BI monthly revenue loss, goal of 0)
- Product price (compared to pre-BI pricing, goal of $110/month)
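The arithmetic behind baseline comparisons like these is simple percentage reduction. As a rough illustration (using invented example numbers, not real benchmarks), a minimal Python sketch might look like:

```python
# Hypothetical pre-BI baselines and post-launch measurements (invented numbers
# for illustration only; your own baselines come from pre-BI tracking).
baselines = {
    "monthly_report_requests": 120,  # ad hoc report requests to IT per month
    "weekly_report_hours": 38,       # IT hours spent building reports per week
}
current = {
    "monthly_report_requests": 72,
    "weekly_report_hours": 23,
}

def percent_reduction(baseline: float, now: float) -> int:
    """Reduction from baseline as a whole percentage (positive = improvement)."""
    return round((baseline - now) / baseline * 100)

for metric, baseline in baselines.items():
    print(f"{metric}: {percent_reduction(baseline, current[metric])}% reduction")
```

The point of the sketch is that each number only means something relative to its pre-BI baseline and its goal; a raw count like "350 report executions per day" has neither.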
Now we see that the initial BI deployment, limited though it was to a handful of canned reports, resulted in a 40% reduction in ad hoc report requests to IT and has saved the team an average of 15 hours of work per week. That’s significant progress, but there’s still a gap to close.
Once the outlined goals are achieved, your team gets to move the goalposts and strive for even higher objectives. Perhaps one day the number of reports executed per day will be a relevant business intelligence success factor, but it’s critical not to assume that it will. BI’s ultimate function is to support decision making, not to generate reports. If that daily report execution number is high because users are frustratedly sifting through output, unable to find the answers they need, you have a UX issue on your hands. By all means, adopt industry-recommended business intelligence success factors, but only if they’re meaningful to you.
Claire Suellentrop, co-founder of SaaS marketing and growth consultancy Elevate, is passionate about distinguishing between user engagement and derived value indicators. “It’s really easy for teams to measure the success of their product by metrics like: number of times a user logs in to the app or daily active users,” she said on a recent episode of Data Talks. But when Suellentrop was part of the marketing team at Calendly, her department realized early on that active user data, an industry-standard KPI, didn’t tell them what they wanted to know.
“The way that we could tell people were getting value from the product was whether or not someone else scheduled a meeting on their calendar. Because Calendly’s a scheduling tool (naturally), you can log in as many times as you want. But if you’re not actively sharing your Calendly link with other people and having them book times with you, you’re not using the product in a way that makes your life easier.”
Embedded BI use cases can vary widely among users: some simply consume PDF reports sent to their inboxes, others export report output to Excel, and still others build and edit ad hoc reports. So before attempting to measure adoption, map out your product’s adoption funnel, and keep in mind that one data analyst might be supplying dozens of users with the insights they need. Combining KPIs will help ensure that you see the whole story.
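To make the funnel idea concrete, here is a minimal sketch in Python. The stage names and counts are hypothetical, chosen to mirror the use cases above; the point is to read adoption stage by stage instead of as a single active-user number:

```python
# Hypothetical adoption funnel for an embedded BI feature (invented data).
# Each stage reflects a different way users derive value from the reports.
funnel = [
    ("received scheduled PDF report", 4000),
    ("viewed a report in-app", 900),
    ("exported report output to Excel", 350),
    ("built or edited an ad hoc report", 60),  # analysts often serve many others
]

total_users = 5000

for stage, users in funnel:
    share = users / total_users * 100
    print(f"{stage}: {users} users ({share:.1f}% of {total_users})")
```

Read in isolation, the last stage looks like 1.2% adoption; read as a funnel, it may represent a small group of analysts delivering insights to most of the user base.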
So, to sum up, here are the key best practices for measuring the success of your embedded BI implementation:
- Track metrics that reflect your BI objectives.
- Revise your goals as you meet objectives.
- Use adoption metrics that reflect derived value.
- Combine KPIs for a more accurate read on BI outcomes.
Apply these, and you’ll be sure to maximize returns on your investment.