The Digital CX Podcast: Driving digital customer success and outcomes in the age of A.I.

Digital Customer Success Maturity Model Assessment Launch and Post Zero-In Q&A | Episode 075

Alex Turkovic Episode 75

In this solo episode of the podcast, I spend a bit of time recapping a recent talk I gave together with Dan Ennis at Zero-In, ChurnZero's annual conference, and also take some time to discuss the launch of the Digital CS Maturity Assessment, which you can go through for free by following this link: https://digitalcustomersuccess.com/dcsmaturity/

I then go through a bit of Q&A from the Zero-In session itself! See the chapters to jump to a specific question.

Chapters:
00:00 - Intro
01:18 - Zero-In Recap
01:36 - Digital CS Maturity Assessment
04:30 - How to Measure the ROI of your Digital program
08:41 - Which self-service mediums are most effective in the SaaS world today that help to promote advocates and reduce support costs?
13:58 - Do you have any books, podcasts or frameworks you would recommend for newbies?
16:13 - “Tech Touch”, Digital CS and Scaled CS interchangeability
18:21 - Incorporating contextual data in predictive modeling

Enjoy! I know I sure did. 

Special shoutout to:
- Dan Ennis: my awesome co-presenter at Zero-In
- Keishla Ceasar-Jones, Malachi Hopoate, Sylvanie Tweed & Tom Battle for their great questions!

Support the show

+++++++++++++++++

Like/Subscribe/Review:
If you are getting value from the show, please follow/subscribe so that you don't miss an episode and consider leaving us a review.

Website:
For more information about the show or to get in touch, visit DigitalCustomerSuccess.com.

Buy Alex a Cup of Coffee:
This show runs exclusively on caffeine - and lots of it. If you like what we're doing, consider supporting our habit by buying us a cup of coffee: https://bmc.link/dcsp

Thank you for all of your support!

The Digital Customer Success Podcast is hosted by Alex Turkovic

Speaker 1:

Today we've got some Q&A for you on the back of the Zero-In event that happened in DC last week, so stay tuned. Once again, welcome to the Digital Customer Experience Podcast with me, Alex Turkovic. So glad you could join us here today and every week as we explore how digital can help enhance the customer and employee experience. My goal is to share what my guests and I have learned over the years so that you can get the insights that you need to evolve your own digital programs. If you'd like more info, need to get in touch or want to sign up for the weekly companion newsletter that has additional articles and resources in it, go to digitalcustomersuccess.com. For now, let's get started.

Hello and welcome back. This is episode 75 of the Digital CX Podcast. I'm so glad that you're back with me this week and every week as we talk about all things digital and CS. My name is Alex Turkovic. It's so great to have you back, and today we're doing another solo episode, because every fifth episode is a solo episode. It's going to be a bit of a Q&A episode on the back of the Zero-In conference that happened in DC last week, which was a spectacular event. I got to meet a lot of you and hopefully picked up a few new listeners out of the event as well, so welcome if you're new here. At the conference, Dan Ennis and I gave a talk essentially about digital CS maturity, and so we have kind of an exciting launch announcement for you today: a digital CS maturity assessment has gone live that you can go through as well.

The talk that Dan and I gave focused on the four pillars of digital customer success, which is something I've talked about on the podcast before; I believe it was episode 65 that took you through those four pillars of digital CS. They're essentially your customer journey, married with your customer data and data hygiene, which power your automations and tech stack, which is pillar number three, and then the fourth pillar is content, which powers all of those digital motions that you're doing. So we talked about the four pillars and about digital CS maturity within those four pillars. In the show notes down below, or the description if you're on YouTube, there's a link to the assessment, which scores you on each one of those four pillars based on the answers that you provide. I think it's roughly 20 or 30 questions or so. You'll get a score at the end of it which should give you a sense for where you stand in your digital maturity on those four pillars, and also some tactical, practical next steps that you can use to improve your own program on the maturity scale. So go to the show notes and check out that link. I'll also put a QR code up here that you can scan if you're watching on YouTube.

Speaker 1:

But all that aside, Dan and I gave this talk. We talked about each of the four pillars, which, again, you can reference in episode 65 of the show, and then we did a little bit of Q&A, and some of you submitted questions in the conference app as well. Some we answered, some we didn't. So I'm going to go through some of the questions that we got in the room, but then also some of the questions we got on the conference app, because I figured they're just as relevant on the podcast and to you listening as they might be to those that were in the room. So, without further ado, let me pull up some of these questions that were asked, and I will answer them to the best of my ability. But I would also love your feedback: if you're listening to this and you're like, oh, I've done this a different way or I've experienced this differently, let me know in the comments or write me an email. Be nice about it, please, but I want to hear what you have to say about all this stuff.

Speaker 1:

So the first question that came up, both in the room and in the app, is, essentially, how do you measure the ROI of your digital CS program as it matures? Which I think is a great question and one that's asked a lot. In fact, it's asked so much that I did a solo episode on this topic five episodes ago; episode 70, I believe, is all about KPIs and measuring the efficacy of your program. What I think is the nuance to this particular question, though, is the "as it matures" part, which denotes: hey, how do you track this thing long term? Before I get to my answer, by the way, thank you Keishla Ceasar-Jones for submitting that question, and her talk there was also amazing. If you don't follow Keishla, you should absolutely do that. She's amazing and an Austinite as well. Okay, measuring the ROI of your program.

Speaker 1:

I think you can bucket this, at a high level, into leading and lagging indicators. I won't go into the weeds on this, because go listen to episode 70, but fundamentally I'm looking at traditional, quote-unquote, CS metrics for my lagging indicators of how my digital program is performing. I'm looking at NRR, GRR, churn rates, retention rates. I'm looking at NPS as a lagging indicator for how my digital cohort is doing, and that word cohort is important, because I'm not necessarily looking at it as a segment, although that is important too. Look at your unnamed segment, in other words those that don't have a CSM, but also look at the ROI on the cohort of customers that went through your digital motions versus those that didn't. That's the huge distinction in measuring the efficacy of a digital program: you're looking at things in the context of a cohort, who went through your stuff, who saw your stuff, who didn't, either at the company level or, more importantly, at the user level. So, lagging indicators, I think that all kind of makes sense. Those are traditional CS and CX metrics.

Speaker 1:

On the leading indicator front, I'm really focused on marketing metrics, on engagement rates with specific motions, because ultimately, a lot of what we're doing in digital is marketing campaigns. You're marketing to your existing customer base, right? So in your email campaigns, I'm looking at open rates and click-through rates and those kinds of things. I'm looking at engagement rates in-app. I'm looking at engagement rates on whatever it is you're doing. Are your users engaging or are they not engaging with specific motions? And then the ultimate aspect of this is: what does the attribution of that engagement look like to whatever it is you're driving? So the example that I've given a couple of times now is: you're driving adoption of a specific feature. Okay, great. Of the users that went through your digital motion for that specific thing, let's say it's an in-app thing, compare the cohort that saw it and engaged with that motion versus those that didn't: was there a marked increase in the adoption and usage of that particular feature, or was there not? That is going to be a really great just-in-time indicator for whether your digital motions are directionally correct, whether they're working, and those kinds of things. And tracking that over time, I think, is very important to see, in the aggregate, as you're stacking up these digital motions, whether you're doing the right things in your program and whether they're having the impact you feel they should be having. So that's a bit on measuring outcomes of specific digital motions and things like that. Again, go listen to episode 70 if you want a little bit more detail.
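If it helps to see that cohort comparison written down, here's a minimal sketch in Python. It assumes you can export, per user, whether they saw a given motion and whether they adopted the feature you were driving; the column names and numbers are made up for illustration rather than taken from any particular platform.

```python
# Minimal sketch: compare feature adoption between users who engaged with a
# digital motion and users who didn't. Data and column names are hypothetical.
import pandas as pd

users = pd.DataFrame({
    "user_id":         [1, 2, 3, 4, 5, 6, 7, 8],
    "saw_motion":      [True, True, True, True, False, False, False, False],
    "adopted_feature": [True, True, False, True, False, True, False, False],
})

# Adoption rate per cohort (True = saw/engaged with the motion).
adoption_by_cohort = users.groupby("saw_motion")["adopted_feature"].mean()
lift = adoption_by_cohort[True] - adoption_by_cohort[False]

print(adoption_by_cohort)
print(f"Adoption lift for the engaged cohort (directional): {lift:.0%}")
```

Tracked motion by motion over time, that lift number is exactly the kind of just-in-time, leading indicator being described here.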

Speaker 1:

A couple of additional questions that we got. Which self-service mediums are most effective in the SaaS world today to help promote advocates and reduce support costs? I'm going to be a little bit annoying here and say it depends, but I am going to give you specifics as well. The "it depends" part is: what is the infrastructure that you have around you, right? Do you have a community? Do you not have a community? Do you have an LMS? Do you not have an LMS? Do you have a knowledge base? Do you not have a knowledge base? So the baseline is: what do you have today that you can leverage from a self-service perspective? Lean in hard on those things.

Speaker 1:

If you have a knowledge base but it's not really being used, you need to start training your customers to use that knowledge base and the other self-serve resources you have available today, to really make sure they're being utilized, because they're not there just to look pretty, they're there to be used. That training mentality needs to be part of your motions, where you're not just providing answers to customers, you're teaching them where to go to get those answers as part of every single motion. If you don't already have office hours available, well, that's more one-to-many than self-serve, but office hours are the go-to when it comes to initial motions to stand up, because they don't take a lot of effort in terms of people and they don't take a lot of effort in terms of tools or investment. That's huge, and you might classify that as self-serve, because somebody's coming to get help instead of waiting for it. But yeah, what e-learning do you have? What resources do you already have available that you can lean on hard and integrate into your digital motions?

Speaker 1:

That said, you are going to want to do a gap analysis. You're going to want to really set a baseline for what you have today but, more importantly, what you don't have today that's being asked for. One of the things in the question was how to reduce support costs, and that's actually where you're going to find out what you need to answer, like what kind of resources you need to provide. Go look at your training-related support tickets and see what commonalities there are, to help prioritize what you need to go solve for in your self-help. If multiple customers are asking about the exact same things multiple times, that is prime territory for you to start building those self-serve assets to help combat some of those things, and then proactively push those assets out. Because if 20 customers are asking, I guarantee you 100 have that same question; they just haven't asked. So, important nuance there.
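That commonality check doesn't need fancy tooling, by the way. Here's a minimal sketch that assumes your training-related tickets already carry a topic tag; the topics and counts below are invented purely for illustration.

```python
# Minimal sketch: count recurring topics in training-related support tickets
# to prioritize which self-serve assets to build first. Topics are hypothetical.
from collections import Counter

ticket_topics = [
    "sso setup", "report builder", "sso setup", "api keys",
    "report builder", "sso setup", "user permissions", "report builder",
]

for topic, count in Counter(ticket_topics).most_common(3):
    print(f"{topic}: {count} tickets -> candidate for a self-serve asset")
```

If your tickets aren't tagged, even a rough manual categorization of a month's worth of tickets gets you the same prioritized list.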

Speaker 1:

Thinking forward on this a little bit, we are at the dawn of artificial intelligence and all that kind of fun stuff, and one of the earliest use cases that has popped up on the customer-facing, self-serve side, and one that is rapidly becoming a reality, in fact, a lot of these things are already stood up, is AI bots built on large language models that are trained on your knowledge base, your support tickets, your community, all of these assets that you have out there, so that a customer can go to this chatbot, ask in plain English about a certain thing and get a plain-English answer back. Obviously, that takes a little bit of effort up front to stand it up and train it properly and all that kind of stuff, but if you have the means and you already have some of this customer-facing documentation, really go lean in on AI, because that, ultimately, I think, is going to have a massive ROI when it comes to reducing support costs and those kinds of things.
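For the curious, here's a rough sketch of just the retrieval half of a bot like that: matching a plain-English question against knowledge base articles so a language model has something grounded to answer from. The articles, the question and the simple TF-IDF matching are all illustrative assumptions, and the actual LLM call is left out entirely.

```python
# Minimal sketch: find the knowledge base article most relevant to a customer's
# plain-English question. In a real bot, the matched article(s) would be passed
# to an LLM as context; that step is omitted here. All content is made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = {
    "Resetting your password": "Go to settings, choose security, then click reset password.",
    "Building your first report": "Open the report builder, pick a data source, add filters.",
    "Inviting teammates": "Admins can invite new users from the team management page.",
}

question = "How do I reset my password if I forgot it?"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(list(articles.values()) + [question])
scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()

best_title = list(articles.keys())[scores.argmax()]
print(f"Most relevant article to ground the answer: {best_title}")
```

Production bots typically swap the TF-IDF step for embeddings and add guardrails, but the shape of the problem, question in, grounded context out, stays the same.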

Speaker 1:

Hey, I want to have a brief chat with you about this show. Did you know that roughly 60% of listeners aren't actually subscribed to the show on whatever platform they're listening to it on? As you know, algorithms love likes, follows, subscribes, comments, all of that kind of stuff. So if you get value out of the content, you listen regularly and you want to help others discover the content as well, please go ahead and follow the show, leave a comment, leave a review; anything that you do there really helps us to grow organically as a show. And while you're at it, go sign up for the companion newsletter that goes out every week at digitalcustomersuccess.com.

Speaker 1:

Now back to the show. One question: do you have any books, podcasts or frameworks you would recommend for newbies? I would recommend this podcast for newbies; there's a lot of content. One of the questions I got at the conference was, hey, I just started listening to your podcast, but you have a lot of episodes, so where do I start? And I didn't have a good answer, which kind of felt bad. One of my plans is to actually take my own advice and put an AI chatbot on the website, trained on the transcripts of all the different shows that we've done, to answer your digital CS questions, but I'm not quite there yet. Plus, you know, that costs a little bit of money. But frameworks and books: look, Nick Mehta and Kellie Capote of Gainsight put together a great book on digital CS. It really lays out the foundation, so definitely check that out.

Speaker 1:

You know, DigitalCustomerSuccess.com is where all my stuff is; there's a bunch of articles that I've put on the website there. And podcasts, well, there's one I can think of: this one specifically. But I would also check out other shows like CELab, which is really good from a customer education perspective. There's tons and tons of content there, and there is no shortage of customer success podcasts that have dealt with digital CS in one form or fashion, so take your pick on those, because I think pretty much all of them have had at least one or two or three or five episodes on digital CS, or have had interviews with various digital CS leaders. So I would start there. Jan Young and I have also partnered up: we've done a webinar and we're planning a course as well, which I'll put a link to in the show notes if you want to check that out. But yeah, there's a growing number of resources out there. The course Jan Young and I are doing is basically a six-week, workshop-type scenario with a community afterwards, which I'm super excited about.

Speaker 1:

Next question that we got was, let's see, and I'm going off the cuff here: I'm finding that terms like tech touch get tossed around and sometimes used interchangeably with digital CS and scaled CS. How important is it to educate your organization about the differences? Great question, because they are kind of used interchangeably, and I would say, fundamentally speaking, digital CS is the new tech touch. There are still people who use the term tech touch, but it's kind of fallen out of fashion, I suppose. To me, digital CS and tech touch are quite interchangeable.

Speaker 1:

I do think there is a huge distinction between scaled CS and digital CS, though, and some may disagree with me on this. The purpose of scaled CS is essentially in the name, right? It is to scale your customer success function. That can take any number of different flavors: whether you're looking at artificial intelligence, a specific scaled team to handle a certain segment, leaning hard on education or leaning into digital CS. I think, ultimately, digital CS and CS Ops are part of a scaled program. It gets confusing, though, because you also have scaled teams out there, and typically that term is used to denote a team of scaled CSMs who are basically the customer-facing element of your digital motions. So it gets confusing. But fundamentally, when you think about scaled CS, digital CS is a method of execution for scaling your CS, for your scaled CS motion.

Speaker 1:

I hope that clarifies it. I might hear it on that one from a few of you, so let me know. Okay, this is an interesting question, and I'll give you a little bit of background after I read it. The question is: how do you incorporate contextual data in predictive modeling? I see the value in knowing it, but quantifying it is a challenge. Okay, so one of the things that Dan talked about in our talk, in relation to data, is the different data types that you would want to pull into a digital program. One of them was product telemetry. One of them was your customer data. One was contextual data, so, for instance, industry data, what's happening in the industry and those kinds of things, and that is a little bit tougher to nail down, right, because ultimately it's tougher to quantify what's happening in the industry.

Speaker 1:

And so the things that I would classify as contextual data are the performance of competitors in the space, for instance, things like the market capitalization of competitors, or your overall space in general. Is it growing? Is it not growing? What's the overall addressable market looking like? And then, going a layer deeper, what do your customers' employment rates look like? If you're serving a specific industry, you can start to pull industry data into your modeling. That helps you, or helps your customers, identify what outcomes they need to drive for.

Speaker 1:

Now I know this is super high level, and part of it has to be, because some of you are building digital programs for customers in a specific industry. That's a bit of an easier nut to crack, because you know exactly who your ICP is; you probably know exactly what your admin's career trajectory is and all that kind of stuff. Others of you serve a huge variety of industries just by the nature of what your platform does, and so it becomes a little bit less predictive, and that's where you're going to want to lean a little bit more on your direct competitors versus just industry data itself. That said, fundamentally speaking, what you're after here is any trending that you can pull into your predictive modeling to help you better advise your customers on some of the strategies they need to put in place for success. If you see the market growing overall, you might make different recommendations than if you see the market shrinking in a particular industry.

Speaker 1:

Those are the kinds of things we're talking about when it comes to influencing your predictive modeling. And what we mean by predictive modeling is this: you're taking all of the data that you have on your customer and your customer's usage, and you're trying to tease out not only who your different personas are, which helps you identify who people are within your customer's organization, but also what churn indicators exist. If you look at the historical data on your churned customers, and you can do this either manually in spreadsheets or with machine learning and various platforms, and find data trends among those churned customers, what you can then do is apply those findings to your existing customer base to help you identify, in a proactive way, who your churn risks are, or who the customers are that you need to celebrate based on certain success criteria. Similarly, you can use the contextual data that you've gathered from various industries, or the market in general, to help trigger certain digital motions at certain times based on market conditions. Simply put, that's one example of what you might do. But ultimately, you are there to help serve your customer, and just because you have a customer that falls maybe 100% into a digital motion doesn't mean you cannot be part of that customer's ecosystem of early warning or industry thought leadership. There are things you can do to help your customers be contextually aware of what's going on around them, so that they're not operating with blinders on and in a bubble, but in a contextually aware environment thanks to you, which helps drive value in your brand. I'll drop a rough sketch of what that churn modeling could look like right below. So those are the big questions that were asked at Zero-In.
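Here's that sketch: a simple model fit on historical churned and retained accounts, including a contextual signal like industry growth, then used to score the current base. Every feature and number here is invented purely for illustration; it's a shape, not a recipe.

```python
# Minimal sketch: learn churn signals from historical customers, then score the
# current customer base. All features and values are hypothetical; in practice
# they would come from your CS platform, product telemetry and industry data.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.DataFrame({
    "logins_per_week": [1, 0, 6, 8, 2, 7, 0, 9],
    "support_tickets": [5, 7, 1, 0, 6, 2, 8, 1],
    "industry_growth": [-0.02, -0.05, 0.03, 0.04, -0.01, 0.02, -0.04, 0.05],
    "churned":         [1, 1, 0, 0, 1, 0, 1, 0],
})

model = LogisticRegression().fit(history.drop(columns="churned"), history["churned"])

current = pd.DataFrame({
    "logins_per_week": [1, 7],
    "support_tickets": [6, 1],
    "industry_growth": [-0.03, 0.04],
})

# Churn probability per current customer; a high score could trigger a
# proactive digital motion or an early-warning play.
print(model.predict_proba(current)[:, 1])
```

With real data you'd validate a model like this before acting on it, but even a rough version helps turn "we think these accounts are at risk" into something you can trigger digital motions from.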

Speaker 1:

If we met last week and said hi, it was great meeting you. If we didn't, but you're here because of Zero-In, welcome to the show. I hope this has been insightful for you. We are back with another phenomenal series of speakers on the show starting next week. I'll just let you be surprised; there are some really cool names coming up in the next few weeks.

Speaker 1:

We are probably going to do shows all the way through the middle of December. We might take a couple of weeks' break in December just for the holidays and be back in January. But yeah, we're marching on towards episode 100, and I couldn't be more excited for that. I have zero plans for episode 100, but if you have any slick ideas, please feel free to let me know. For now, I'll leave you with that. I hope you enjoyed the episode, and we'll talk to you again next week. Thank you for joining me for this episode of the Digital CX Podcast. If you like what we're doing, consider leaving us a review on your podcast platform of choice or, if you're watching on YouTube, a comment down below. It really helps us to grow and provide value to a broader audience. You can get more information about the show and some of the other things we're doing at digitalcustomersuccess.com. I'm Alex Turkovic. Thanks so much for listening. We'll talk to you next week.
