There is far more information to be gleaned from metrics than the raw page view counts on self-help content. In this second installment of the Self-Service Metrics series, Chris Chagnon offers key steps for turning that data into action.

by Chris Chagnon
Date Published January 5, 2021 - Last Updated January 14, 2021

In part one of the Self-Service Metrics series we covered terminology and the power of analytics for understanding more about how our portals and hubs are being used. But the real value of these data points lies in how we interpret them and what meaning we draw from them.

This second entry in the Self-Service Metrics series covers some strategies for examining this data, combining it, and using it as a decision-making tool.

Winning the popularity contest

One of the most fundamental numbers we look at on our pages is ‘views’. We like to know how many people are using our tools, and which parts of them they visit. But this data can be used to derive far more information than just knowing which pages are most viewed. A popular page can be an indicator of many things; we can make assumptions or inferences as to why based upon the page’s content, but we may also want to check with our users to confirm.

One thing to do with popular pages is to try to figure out why they are so popular. Does a page enable self-service, and is therefore heavily used? Does it have information about a service that is very difficult to use?

When reviewing knowledge articles, it’s good to take a look at each article’s popularity. More popular articles can benefit from more frequent reviews, while articles that are rarely or never used may be prime candidates for the chopping block, or for consolidation into existing articles to reduce noise in search results.
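For teams that can export their analytics data, a minimal Python sketch along the following lines can surface both groups. It assumes a hypothetical CSV export with “article” and “views” columns and an arbitrary low-traffic threshold, so treat it as a starting point rather than a finished report.

import pandas as pd

# Hypothetical export: one row per knowledge article with a total view count.
articles = pd.read_csv("article_views.csv")   # columns assumed: article, views

ranked = articles.sort_values("views", ascending=False)

# The most-viewed articles get a more frequent review cycle.
review_often = ranked.head(20)

# Articles with almost no traffic are candidates for retirement or merging.
retire_candidates = ranked[ranked["views"] <= 5]

print("Review more often:\n", review_often)
print("Retirement candidates:\n", retire_candidates)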

Timing is everything

By faceting this data over periods of time, we can start to develop trend reports. Which pages are most viewed? At which times are they viewed? Do visits to our self-service password page go up on nights and weekends when our service desk is closed?

We can start to make these sorts of connections by looking at pageviews in the context of the time they are observed. This can also be used to confirm the efficacy of other channels, such as checking clickthrough on that morning IT News email, or follow-through from a new employee orientation batch. Combining time with pageviews is a powerful way to start noticing trends in your pages and content.
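As one way to put this into practice, the sketch below buckets views of a password page by day of week and hour, so off-hours spikes stand out. It assumes a hypothetical hit-level export with “page” and “timestamp” columns, and the page URL is made up for illustration.

import pandas as pd

# Hypothetical hit-level export: one row per pageview.
hits = pd.read_csv("pageviews.csv", parse_dates=["timestamp"])  # columns assumed: page, timestamp

pw = hits[hits["page"] == "/self-service/password-reset"].copy()
pw["hour"] = pw["timestamp"].dt.hour
pw["weekday"] = pw["timestamp"].dt.day_name()

# Views per weekday/hour bucket; nights and weekends should stand out if
# users turn to self-service when the service desk is closed.
by_time = pw.groupby(["weekday", "hour"]).size().rename("views")
print(by_time.sort_values(ascending=False).head(10))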

A second aspect of timing for content is duration, or time on page. By examining the average time on page for a piece of content, we can start to assess how well it is working. If we have an article that should take users five minutes to walk through, but the average time on page is fifteen minutes, that indicates there may be a disconnect or issue in the content that we should try to rectify.

Similarly, if the average time on page is very low, it may indicate a false positive or a bad pathway for your users: they got to the page thinking it was what they needed, but navigated away from it quickly. This can be something as simple as a mistitled link, or it can point to a disconnect between the language users use and the language IT uses to describe a topic.
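Both signals can be checked in the same pass. The sketch below assumes a hypothetical analytics export of average seconds on page, plus a second file where we record how long each article should take to work through; the thresholds are illustrative, not prescriptive.

import pandas as pd

# Hypothetical inputs: analytics export of average seconds on page, plus our
# own estimate of how long each article should take.
stats = pd.read_csv("time_on_page.csv")       # columns assumed: page, avg_seconds
expected = pd.read_csv("expected_time.csv")   # columns assumed: page, expected_seconds

merged = stats.merge(expected, on="page")

# Much longer than expected: the content may be confusing or incomplete.
too_long = merged[merged["avg_seconds"] > 2 * merged["expected_seconds"]]

# Very short visits: likely a false positive, such as a mistitled link or a
# mismatch between user language and IT language.
bounces = merged[merged["avg_seconds"] < 15]

print(too_long[["page", "avg_seconds", "expected_seconds"]])
print(bounces[["page", "avg_seconds"]])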

The path

Analytics tools tend to think about source or referrer information in terms of advertising, but a side perk of this data for self-service is that we can view the whole story of where a user has been on our portals. They start on one link, then click to another, and another, until they either leave our site or end at another terminal action, such as submitting a help ticket. By examining users’ paths, we can build a narrative of how they look for information. A user may have started in a news post, worked their way to a service catalog entry, clicked through to a knowledge article, but then had unanswered questions and submitted a ticket. This path is valuable to us because we can then go back in and update content as trends emerge.

In addition to narratives like the one above, knowing the user’s full path can help us find content loops and dead ends, as well as new content relationships or aliases we may want to add. A user looking for access to a shared drive may find an article on ordering hard drives through our desktop group, then work their way to account pages, then finally end up on a network storage or identity and access management page. This type of information is helpful when adding relationships between content, or when considering how we can better serve users.
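For those who want to dig into path data directly, here is a rough sketch that rebuilds per-session paths from a hypothetical hit-level export (with assumed “session_id”, “timestamp”, and “page” columns and made-up URLs), then looks for sessions that pass through a knowledge article and still end in a ticket.

import pandas as pd

# Hypothetical hit-level export: one row per pageview, in no particular order.
hits = pd.read_csv("pageviews.csv", parse_dates=["timestamp"])  # columns assumed: session_id, timestamp, page

# Rebuild each user's path as an ordered list of pages.
paths = (hits.sort_values("timestamp")
             .groupby("session_id")["page"]
             .apply(list))

# Paths that pass through a knowledge article but still end in a ticket
# suggest the article left questions unanswered.
needs_review = paths[paths.apply(
    lambda p: any(page.startswith("/kb/") for page in p) and p[-1] == "/submit-ticket"
)]

# The most common full paths highlight loops, dead ends, and missing links.
print(paths.apply(tuple).value_counts().head(10))
print(needs_review.head())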


Chris Chagnon is an ITSM application and web developer who designs, develops, and maintains award-winning experiences for managing and carrying out the ITSM process. Chris has a Master of Science in Information Technology, and a bachelor’s degree in Visual Communications. In addition, Chris is a PhD Candidate studying Information Systems with a focus on user and service experience. As one of HDI’s Top 25 Thought Leaders, Chris speaks nationally about the future of ITSM, practical applications of artificial intelligence and machine learning, gamification, continual service improvement, and customer service/experience. Follow Chris on Twitter @Chagn0n.
