The Digital CX Podcast: Driving digital customer success and outcomes in the age of A.I.
This podcast is for Customer Experience leaders and practitioners alike, focused on creating community and learning opportunities centered on the burgeoning world of Digital CX.
Hosted by Alex Turkovic, each episode will feature real and in-depth interviews with fascinating people within and without the CS community. We'll cover a wide range of topics, all related to building and innovating your own digital CS practices. ...and of course generative AI will be discussed.
If you enjoy the show, please subscribe, follow, share and leave a review. For more information visit https://digitalcustomersuccess.com
Four Ways to Measure Your Digital Customer Success Program | Episode 070
Monthly Scale and Digital Meetup: https://digitalsuccess.gradual.com
The Ultimate Guide to Digital CS: 4 Pillars for Success w/ Alex Turkovic & Jan Young: https://zoom.us/webinar/register/6817263533053/WN_GqOn3y7JSZWBt_YM3i87kw#/registration
In today's show...
Measuring Digital Customer Success can be an elusive thing. It necessitates borrowing from all manner of different practices in order to do it effectively.
In this solo episode, I break down four areas of focus for establishing KPIs in DCS:
- Traditional CS Metrics
- Marketing Campaign Metrics
- Program Specific Measures
- Attribution
Chapters:
- 00:00 - Intro
- 01:14 - News
- 03:04 - Measuring your Digital CS Program
- 04:28 - Traditional CS Metrics in Digital
- 06:57 - Marketing Campaign Metrics
- 08:31 - Digital CS Program Specific Metrics
- 10:44 - Attribution
- 12:09 - Examples of Measuring Attribution in DCS
- 17:04 - How do you action unengaged accounts/contacts?
- 20:18 - Recap
Enjoy! I know I sure did.
+++++++++++++++++
Like/Subscribe/Review:
If you are getting value from the show, please follow/subscribe so that you don't miss an episode and consider leaving us a review.
Website:
For more information about the show or to get in touch, visit DigitalCustomerSuccess.com.
Buy Alex a Cup of Coffee:
This show runs exclusively on caffeine - and lots of it. If you like what we're doing, consider supporting our habit by buying us a cup of coffee: https://bmc.link/dcsp
Thank you for all of your support!
The Digital Customer Success Podcast is hosted by Alex Turkovic
We've got another solo episode for you this week. This one's all about measuring your digital program. Once again, welcome to the Digital Customer Experience Podcast with me, Alex Turkovic. So glad you could join us here today and every week, as we explore how digital can help enhance the customer and employee experience. My goal is to share what my guests and I have learned over the years so that you can get the insights you need to evolve your own digital programs. If you'd like more info, need to get in touch or want to sign up for the weekly companion newsletter that has additional articles and resources in it, go to digitalcustomersuccess.com. For now, let's get started. Hello, and welcome back to the Digital CX Podcast. This is episode 70. My name is Alex Turkovic, so great to have you here today. As you know, every fifth episode we do a solo show, and that means today you're going to be hearing me yapping about stuff. Specifically, what we're going to be talking about today is how to measure the efficacy of your program and what kind of KPIs to put behind it to measure its success. Before we get into that, though, I do want to talk about a couple of upcoming things that are worth noting. One is a Scale and Digital CS meetup series that I'm running along with Scott Wilder and Sam David. We've teamed up to run these regular sessions, and the first one is coming up in October. We were going to do one in September, but it got pushed. We have some pretty exciting people on the lineup: the first session is going to be with Izzy Carey from Ramp, Josh Schachter is joining us in November, Dan Ennis in December, and Aaron Hatton at the beginning of the year. We'll do these once a month, and it's a great way just to meet other digital CS practitioners.
So if you're interested in that, there's a link down in the show notes, but it's at digitalsuccess.gradual.com. The second is that I'm teaming up with Jan Young, who a lot of you probably know, because we're working on a Digital CS workshop. The first thing you're going to see from us is a free workshop that you can attend. It's an hour long, on October 10th, and it's going to be all about the ultimate guide to digital CS. We'll talk about the four pillars that I've covered on the show before, how to use the tools around you to engage customers at scale, and how to get executive buy-in for your program and collaborate across different departments. So, really cool. Hope you join us for that. The registration link is also down in the show notes if you want to go register. Super excited about that.
Speaker 1:So on to the main topic for today, which is measuring your program: the KPIs you need to succeed in digital CS. What I've done is break this out into four categories of measuring your program. The first is traditional CS metrics, where we apply some traditional metrics to digital CS and vary them slightly, because it is effectively a different program. The second is marketing campaign metrics, because ultimately we are driving a lot of marketing-campaign-style engagements with customers, so it makes sense to measure it that way. The third category is program-specific metrics: what are the metrics around how you're running your program and the activities taking place in it, and how are you going to measure those? The last is what I'll call attribution: how are you attributing the activities and the things you're doing back to an improvement of your business? And I would argue that that last category is actually a combination of the first three. So, without further ado, let's get into it.
Speaker 1:So, first off, let's talk about traditional CS metrics. And yeah, I'm talking about GRR, NRR, renewal rates, churn rates, those kinds of things we typically see associated with a customer success program. The only difference is that you're not necessarily applying them to an individual CSM's book of business; you're attributing them to a segment of customers. The easiest way I've found to apply those metrics to a segment is to measure the group of customers who don't have an assigned customer success manager. That works very well in my business and a lot of other businesses, but it may not necessarily work well within yours. Still, one really easy way to attribute metrics is whether or not an account has a CSM, and in this case you'd be attributing them to customers who do not have one. The other approach, if you do have a scaled team or another customer-facing team in place: traditionally you might apply GRR and NRR to a CSM's book of business, but here you'd apply them instead to customers you have engaged with versus those you have not. What you're doing is comparing two cohorts of customers, those that have been engaged by your team versus those that haven't, and looking for an uplift in the engaged cohort. There are some other metrics that can go along here. One thing to look at is CSQLs: if your digital team is booking leads into the sales funnel for various reasons, that's a great thing to track. Net Promoter Score is always a good one to look at across your digital scale segment, if you will.
So all of that essentially encompasses what we would classify as your traditional CS metrics.
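As a rough sketch of how these traditional retention metrics might be computed per segment (the field names and sample figures below are illustrative, not from the episode):

```python
# Illustrative sketch: GRR and NRR for one segment of accounts, e.g. the
# cohort of customers with no assigned CSM. Field names and numbers are
# hypothetical; substitute your own renewal data.

def grr_nrr(accounts):
    """Gross and net revenue retention for a segment over a period.

    Each account dict carries starting ARR plus churned, contracted
    (downsell) and expanded ARR for the period.
    """
    start = sum(a["start_arr"] for a in accounts)
    churn = sum(a["churned_arr"] for a in accounts)
    contraction = sum(a["contracted_arr"] for a in accounts)
    expansion = sum(a["expanded_arr"] for a in accounts)
    grr = (start - churn - contraction) / start  # expansion excluded
    nrr = (start - churn - contraction + expansion) / start
    return grr, nrr

# Hypothetical no-CSM segment: one healthy account, one full churn
no_csm = [
    {"start_arr": 100_000, "churned_arr": 0, "contracted_arr": 5_000, "expanded_arr": 10_000},
    {"start_arr": 50_000, "churned_arr": 50_000, "contracted_arr": 0, "expanded_arr": 0},
]
grr, nrr = grr_nrr(no_csm)
print(f"GRR: {grr:.0%}, NRR: {nrr:.0%}")  # GRR: 63%, NRR: 70%
```

Running the same function over the CSM-assigned segment (or the engaged vs. unengaged cohorts) gives you the side-by-side comparison described above.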
Speaker 1:Your second bucket of metrics is more marketing campaign related. Here, specifically, I'm talking about engagement metrics for a specific campaign. Most commonly you'll see things like email open rates, click-through rates and those kinds of things, engagement rates with various campaigns. You can also apply that to your in-product campaigns, things that are surfaced in product, measuring whether a customer is engaging with them or not. You're basically going to be measuring some engagement rates there. You might also want to look at Google Analytics data from a specific portion of the website. A lot of communities, for instance, will give you the ability to look at Google Analytics data, so you might want to consider where your traffic sources are coming from and those kinds of things. That's a little bit more of a stretch, and, quite honestly, a lot of programs don't really play too well with those kinds of metrics. A best practice here is to work with your marketing organization to see what analytics tools are already in place that you can leverage for your digital program, and specifically for the campaigns you're running out of it.
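A minimal sketch of the campaign-level engagement rates mentioned above (the delivered/open/click counts are invented for illustration; most email and in-app tools report these directly):

```python
# Illustrative sketch: basic email-campaign engagement rates.
# All counts are hypothetical assumptions.

def engagement_rates(delivered, opened, clicked):
    open_rate = opened / delivered        # opens per delivered email
    click_through = clicked / delivered   # clicks per delivered email
    click_to_open = clicked / opened if opened else 0.0  # clicks per open
    return open_rate, click_through, click_to_open

open_rate, ctr, ctor = engagement_rates(delivered=2000, opened=640, clicked=96)
print(f"open rate {open_rate:.1%}, CTR {ctr:.1%}, CTOR {ctor:.1%}")
```

Tracking these per campaign over time is what lets you compare one motion against another rather than judging a single send in isolation.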
Speaker 1:Number three is where you're going to get into the weeds a little bit, and this is a little bit more activities-based. We're talking about program-specific activities and metrics to track. Here I'm thinking about things like how many workflows and motions have been built and implemented. You could almost look at this the way a product manager might look at their sprints, sprint planning cycles, delivery and all that kind of stuff. Definitely things like how many emails have been sent and how many customers have been engaged with. How many trials were initiated, which is something you might or might not measure depending on how your software and your lead funnel work. Opportunities initiated, opened, closed, those kinds of things. For activity metrics, you might look at the number of meetings booked: if you give your customers the opportunity to book meetings with your scale team, then definitely look and see how many meetings were booked versus held. All of that is part of a really healthy activity dashboard for the team. It allows you to look at past performance and set goals going forward on what activities the team should be driving, based on what your norms are and what your goals for growth are. An additional metric I like to look at here, in terms of your program specifically, is how many human hours, or human effort hours, you have saved by implementing various automations, and the efficiencies that have resulted from that.
Speaker 1:That one in particular is a fantastic one to use in the context of a business case. A lot of these can definitely play a part in that, but if you're writing a business case for expanding your program, or even for implementing a program to begin with, using human hours saved and efficiencies gained is a fantastic way to justify the cost of investing in a digital CS program. So we've gone through some traditional CS metrics, some marketing campaign-based metrics, and some program-specific, action-oriented metrics. The last is really putting all those things together, and we'll call it attribution metrics. A simple way to think about this is: what motions have you put in place, who has engaged with those motions, did they have the desired outcome, and by how much did the metric increase or decrease, depending on what your goal is?
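As back-of-the-envelope arithmetic for the human-hours-saved figure in a business case (every number below is a made-up assumption; substitute your own):

```python
# Illustrative sketch: hours saved by an automation, for a business case.
# All figures are hypothetical assumptions.

minutes_per_manual_touch = 15       # time a human would spend per outreach
automated_touches_per_month = 1200  # touches now handled by the automation
loaded_hourly_cost = 60             # fully loaded cost of an hour of CSM time

hours_saved = automated_touches_per_month * minutes_per_manual_touch / 60
monthly_value = hours_saved * loaded_hourly_cost
print(f"{hours_saved:.0f} hours/month saved, worth about ${monthly_value:,.0f}/month")
```

Multiplying out by twelve months gives the annualized figure most business cases want, which you can then set against the cost of the tooling.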
Speaker 1:And I'll give you a couple examples here. Hey, I want to have a brief chat with you about this show. Did you know that roughly 60% of listeners aren't actually subscribed to the show on whatever platform they're listening on? As you know, algorithms love likes, follows, subscribes, comments, all of that kind of stuff. So if you get value out of the content, you listen regularly, and you want to help others discover the content as well, please go ahead and follow the show, leave a comment, leave a review. Anything you do there really helps us grow organically as a show. And while you're at it, go sign up for the companion newsletter that goes out every week at digitalcustomersuccess.com.
Speaker 1:Now back to the show, and I'll give you a couple examples here. Let's say you're looking at adoption metrics for specific features, and there's one feature that is both critical to a customer's longevity and stickiness but also historically sees low adoption rates within the first, let's call it, six months or so. The exact timeframe is a little bit irrelevant; the point is you have a feature, it has low adoption rates, and you know from the past that customers who have adopted this feature are more likely to stay with you in the long term, so it's an important thing to focus on. So you set out to build a campaign that aims to increase the adoption of this particular feature.
Speaker 1:Now, this campaign can be multifaceted. It could be in-product based, kind of informative and training in-product, or it could be email based. Build the campaign out with multiple personas in mind, where you might target your administrators a little differently, more from a technical perspective, and your executives a little more on the ROI and the business case of adopting this certain feature. There's no super prescriptive way to go about it, so we won't necessarily talk about it that way. But you've built this campaign. It's multi-threaded: it's email, it's in product. Heck, you might send out some physical mail, whatever it looks like, and maybe you put an e-learning course together, some community posts and knowledge base articles. You've built out this informational campaign that aims to increase the adoption and implementation of a specific feature.
Speaker 1:You launch the campaign. Some people engage with it, some more than others, and others don't engage with it at all. So now you have these two or three cohorts of customers: those who have not engaged with your campaign, those who have engaged with it somewhat, and those who've fully engaged across multiple personas. Or you could just be binary about it, customers who have and customers who have not; you can keep it simple. The point is, you have these cohorts, and over time you'll be able to measure the adoption rate of this particular feature in both. Ideally, you'd see a substantial increase in adoption rates for the cohort that did engage with your campaign, took your content and took some of the actions you prescribed. And, as such, you're creating attribution measurements for your specific campaigns that allow you to say to the business: hey, look, we did this, it increased adoption by this much, and happy days. If you do that for all of your campaigns, it can be an incredibly powerful thing.
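The cohort comparison described above can be sketched in a few lines, keeping it binary (engaged vs. not engaged); the account records here are invented for illustration:

```python
# Illustrative sketch: feature-adoption uplift between customers who
# engaged with a campaign and those who did not. Data is hypothetical.

def adoption_rate(cohort):
    """Share of accounts in the cohort that adopted the feature."""
    return sum(a["adopted"] for a in cohort) / len(cohort)

accounts = [
    {"engaged": True,  "adopted": True},
    {"engaged": True,  "adopted": True},
    {"engaged": True,  "adopted": False},
    {"engaged": False, "adopted": True},
    {"engaged": False, "adopted": False},
    {"engaged": False, "adopted": False},
]

engaged = [a for a in accounts if a["engaged"]]
unengaged = [a for a in accounts if not a["engaged"]]
uplift = adoption_rate(engaged) - adoption_rate(unengaged)
print(f"engaged {adoption_rate(engaged):.0%} vs unengaged "
      f"{adoption_rate(unengaged):.0%} -> uplift {uplift:.0%}")
```

With real data you'd also want enough accounts in each cohort for the difference to be meaningful rather than noise, but the shape of the measurement is the same.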
Speaker 1:It's also not easy, right, and you're not going to want to do it for absolutely every single thing you do, because going from one onboarding email campaign to another, the juice may not be worth the squeeze if you're taking extra time to also think about how you're going to measure the attribution and the success of that campaign with customers who engaged versus those who didn't. Another example of this might be related to risk. If you have a subset of customers who are at a high risk level because of a certain factor, you can build out a campaign to target those specific customers and personas that may be exhibiting that risk, engaging with them in a meaningful way. Again, you'll then be left with cohorts, people who engaged versus didn't, and you'll be able to measure the delta between the two. Ideally, again, the customers who engaged with your campaign will be the more successful of the bunch.
Speaker 1:Now then comes the question: okay, what do I do with the cohort of customers who have not engaged? Because a lot of folks will just stop there and say, hey look, this campaign was successful here, chalk it up to a win, we've made some improvements. Okay, but really, for me, any kind of metric where a customer exhibits zero signs of being present is almost more worrying than one with specific warning indicators. What I mean by that is, let's just use support telemetry, for example, because I think we can all relate to that.
Speaker 1:If a customer puts in let's call it a normal volume of tickets throughout the year, that's gonna be great because they're engaged with us. They know where to get help. Hopefully we're resolving those issues as they come up right. It's gravy Now if a customer puts in a high number of tickets, and especially a high number of tickets that get escalated over time. That's obviously an issue no brainer.
Speaker 1:But if a customer puts in zero tickets, to me that's a warning sign and I've always wanted to proactively engage customers who are not there. Nobody's home, the lights are off, right, there's zero tickets being put in over the course of a year. That's a sign to go engage with that customer. And I think that's a sign to go engage with that customer. And I think that's very similar with customers who don't engage in your motions.
Speaker 1:If you have a customer who is just not engaging in your emails or in-app or anything like that, chances are they are either not using the platform to its fullest or they have something against kind of digital engagements or something like that. That is where you're going to want to focus some of your human capacity on the team to engage those customers in an actual human-to-human engagement right. It's okay to pick up the phone once in a while You're digital in scale, but that doesn't mean go all digital in scale. That means you know pick up the phone when you need to. And those are the instances where I think actually, you know, having humans pick up the phone or, you know, try to schedule a meeting and those kinds of things are really really valuable instances of when you would use a human versus an automation.
Speaker 1:So I know I went on a little bit of a tangent there, but it's one thing to measure and to have these kinds of metrics and dashboards in place; it's also important that we talk about how to action those metrics. And I think that attribution section is the gold, where you can identify both your successes and your warning signs in one fell swoop. So again, we went over traditional CS metrics, some marketing stuff, some program stuff, and then that attribution element of things. I hope this breakdown has been helpful. I'd love to hear from you: how are you measuring the efficacy of your digital program? What have you put in place? What dashboards are you looking at?
Speaker 1:You can always email me, it's alex@digitalcustomersuccess.com, and let me know what you're doing. Also, if you're watching this on YouTube, leave it down in the comments. I hope this has been helpful. I love doing these solo shows because they allow me to share some of the things that are on my mind. We'll be back next week with another series of pretty phenomenal guest interviews, and I can't wait to kick that next series off. So with that, thank you for listening, and I wish you a good rest of the day. Thank you for joining me for this episode of the Digital CX Podcast. If you like what we're doing, consider leaving us a review on your podcast platform of choice. If you're watching on YouTube, leave a comment down below. It really helps us to grow and provide value to a broader audience. You can get more information about the show and some of the other things we're doing at digitalcustomersuccess.com. I'm Alex Turkovic. Thanks so much for listening. I'll talk to you next week.