Synergy: Aligning Training, Communications, and Metrics to Optimize Knowledge Management


by Cindy Smith


If your service desk already has a great set of processes in place, good news: with just a little effort, you can reuse many of those same processes to optimize and improve your knowledge management program! After all, reuse is one of the main principles of Knowledge-Centered Support (KCS). There are four primary areas where you’re probably already using service desk processes that can be incorporated into knowledge management: training, ticket review, communications, and reporting.

Training

You spend weeks training new analysts on your applications, processes, and ticketing system, but do you have specific training for searching, creating, and updating knowledge? In a recent HDI Research Brief on knowledge management in technical support, 33 percent of respondents reported that they don’t provide formal training on searching their knowledge bases, while 43 percent reported that they don’t provide formal training on entering content into their knowledge bases. It seems very counterproductive to commit time and resources to starting a knowledge base and then not follow that up with training on how to use it.

Training is one of the keys to a successful knowledge management program, and an investment in knowledge base training, specifically, can pay great dividends in both the short and long terms. Just remember that, however you structure your training program, it should emphasize both searching and contributing: effective searching can result in reduced call times, faster resolution, and increased customer and analyst satisfaction, while contribution training can result in more usable knowledge contributions and a faster time to publish. The best training approach is multifaceted, and it’s one you’re most likely already using in your other training processes:

  • An overview session
  • Documentation:
    • The knowledge workflow, or how a submission becomes a published document
    • Search tips, including any advanced search features
    • Policies and expectations for searching and contributing knowledge
    • Knowledge document formats and features (including a style guide or formatting guidelines)
    • Policies and expectations for updating knowledge documents
    • Processes for viewing knowledge data and history in a ticket
    • Knowledge reports
  • Shadowing and mentoring
  • Follow-up coaching and training (as needed)

At my company, after a few days spent getting oriented, new analysts attend a one-hour training session with the knowledge manager. That training provides an overview of the knowledge base, expectations for the analyst’s role in knowledge management, different components of the system and knowledge documents, and best practices for searching, updating, and creating knowledge. The analysts then shadow more-experienced analysts and ask questions; the more-experienced analysts, in turn, observe the new analysts using the knowledge and answer their questions.

Including basic knowledge management training in your new-hire training program will enable your analysts to hit the ground running—and they’ll be more confident and more productive.

Ticket Review

Most service desks have a ticket review process that evaluates ticket quality by looking at accuracy, the quality of the description provided, and the documentation of the steps taken toward resolution. To make this process even more comprehensive, the knowledge criteria associated with the ticket should be evaluated as well.

Assuming your analysts link knowledge articles to their tickets, and that the activities in that process are tracked in a log, there are several knowledge-related processes in the ticket that you can and should review:

  • Was the knowledge trail reviewed (that is, the knowledge trail established before the analyst received the ticket)? If it wasn’t, the analyst may suggest steps that have already been tried, which slows down resolution and frustrates the customer.
  • Was an appropriate search attempted? If searches are too broad, they may not return closely matched results. In this case, your analysts may need more coaching and training to improve the quality of their searches and results.
  • Was the appropriate document linked to the ticket? Verify that the analyst isn’t arbitrarily attaching documents just to inflate the percent of tickets closed with knowledge attached metric.
  • Did the analyst email the document to the customer (if that feature is available)? To simplify the support process, either provide the customer with instructions so they can follow along as the analyst walks through the steps, or have the customer perform the steps unassisted.
  • If an appropriate document was not found, was knowledge created? Adding knowledge when none is found increases the likelihood that the issue will be solved quickly the next time it arises (i.e., lower resolution time). In fact, if the knowledge is made available for self-service, the customer may not need to call at all.

There are two primary benefits to including knowledge information in the ticket review process: it increases ticket quality, and it improves both the knowledge process and the content in the knowledge base. My team performs regular random ticket reviews for all level 1 analysts, and we have a process for requesting a ticket review when a ticket may not have been completed correctly.
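The knowledge criteria in a ticket review lend themselves to a simple automated checklist. Here is a minimal sketch in Python; the field names and the `review_ticket` helper are hypothetical illustrations, not features of any particular ticketing system, and a real review would still need human judgment (for example, on whether the linked document truly matches the issue):

```python
# Hypothetical knowledge criteria for a ticket review, expressed as
# (log field, human-readable description) pairs.
KNOWLEDGE_CRITERIA = [
    ("knowledge_trail_reviewed", "Knowledge trail reviewed before suggesting steps"),
    ("search_attempted",         "An appropriate search was attempted"),
    ("document_appropriate",     "The linked document actually matches the issue"),
    ("document_emailed",         "Document emailed to the customer (if available)"),
    ("knowledge_created",        "Knowledge created when no document was found"),
]

def review_ticket(ticket: dict) -> list[str]:
    """Return the descriptions of any knowledge criteria the ticket fails."""
    return [desc for field, desc in KNOWLEDGE_CRITERIA if not ticket.get(field)]

# Example ticket log entry (invented data):
ticket = {
    "knowledge_trail_reviewed": True,
    "search_attempted": True,
    "document_appropriate": False,
    "document_emailed": True,
    "knowledge_created": True,
}
print(review_ticket(ticket))  # ['The linked document actually matches the issue']
```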

Communications

Regular communication and status updates keep customers informed and increase their overall satisfaction. Communicating knowledge-related activities can be equally beneficial, and, like ticket notifications, many knowledge notifications can be automated, keeping participants informed about a variety of knowledge document activities. Some helpful knowledge communications include:

  • Document published notification: These can be sent both to the document author, as an acknowledgement, and to the owner, as a notification that the document needs to be reviewed.
  • Document to approve notification: If you have an approval process, this notification should be sent to the document owner for approval before publication.
  • Document retired notification: This should be sent to the document owner.
  • Comment received notification: This should be sent to the document owner so that he or she can respond to the comment and update the document, if necessary.
  • Lists: Lists can be used to communicate information, and your system may be able to autopopulate them. For example, you might maintain a Top Documents list, so frequently needed documents can be accessed without searching, or a New Documents list, so analysts can become familiar with documents before they’re needed during a call.
  • Replies: Reply to comments and suggestions from customers and analysts. Acknowledge the source, let them know what was done with their idea, and recognize their contribution. One way we do this is by forwarding the comment to the author and the author’s manager.

Consistent communication about knowledge keeps people informed, engaged, and aware of your processes and of the important part people play in the knowledge management ecosystem.

Reporting

In addition to your no-doubt wide range of reporting on service desk tickets, phone statistics, customer satisfaction data, and dozens of other metrics, reporting on knowledge can be an invaluable tool for showing progress, identifying areas for improvement, and illustrating the ROI of your knowledge implementation in terms of increased support productivity and decreased support costs.

Knowledge metrics can be broken down into the following categories:

  • Document attributes and measurements
    • Lifecycle dates (create, publish, review, retire)
    • People and groups associated with the document (author, owner, subject matter expert)
    • Usage (hit and attach counts, last hit and attach dates)
  • People search and contribution measurements
    • Documents by author
    • Comments by author
  • Process measurements
    • Call deflection
    • Percent of tickets closed with knowledge attached
    • Time to publish
    • Documents past review date
  • Baseline and trends
    • Number of documents
    • Percent of tickets closed with knowledge attached
    • Search trends
    • Self-service usage

Most knowledge metrics need to be looked at together to provide an accurate picture of your knowledge performance. While the top knowledge metrics usually involve quantities, those numbers alone won’t tell the complete story. Whether you have 1,000 or 100,000 documents in your knowledge base, the count means little unless you can combine it with the percent of tickets closed with knowledge and the percent of documents used at least once.

For example, if you have 1,000 documents and you close 75 percent of tickets with knowledge, that looks very good on the surface. But if you can also see that only 20 percent of your published documents have been used at least once, your organization has likely spent a great deal of time on documents that aren’t useful, and these documents should be reviewed.
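The arithmetic behind that example is worth making explicit. Here is a quick sketch; the raw counts are hypothetical numbers chosen to reproduce the figures above:

```python
# Hypothetical counts illustrating the combined-metrics check described above.
published_docs = 1000
tickets_closed = 4000
tickets_closed_with_knowledge = 3000
docs_used_at_least_once = 200

pct_closed_with_knowledge = tickets_closed_with_knowledge / tickets_closed * 100
pct_docs_used = docs_used_at_least_once / published_docs * 100

print(f"Closed with knowledge: {pct_closed_with_knowledge:.0f}%")  # 75%
print(f"Documents used at least once: {pct_docs_used:.0f}%")       # 20%

# A high close-with-knowledge rate paired with low document usage suggests
# many documents are candidates for review or retirement.
```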

However, even apparent successes should be scrutinized, both to verify that the knowledge is benefiting your organization and to determine whether your performance can be improved upon. For example, if percent of tickets closed with knowledge attached is very high, you may want to consider involving problem management to see if you can eliminate redundant issues.

Analyzing searches and hits can also indicate whether the information in your knowledge base is easy to find. Note the ratio of customer searches to hits; if customers have to run an average of more than one or two searches to get a hit, you may want to more closely analyze the data and tweak your documents. One way to lower the ratio is to make documents available without searching. For example, when you’re rolling out a new solution or service, provide a list of Top Documents or Announcements that link to relevant knowledge documents.
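The ratio check described above can be sketched in a few lines; the search-log totals here are hypothetical:

```python
# Hypothetical search-log totals: a "hit" is a search that led the
# customer to a document they actually used.
searches = 5200
hits = 2000

ratio = searches / hits
print(f"Searches per hit: {ratio:.1f}")

# More than about two searches per hit suggests findability problems:
# review document titles, keywords, and search configuration.
if ratio > 2:
    print("Findability warning: customers average more than two searches per hit.")
```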

Finally, pay close attention to two key metrics: percent of tickets closed with knowledge attached and call deflection. Call deflection is a great way to quickly calculate the ROI on your knowledge management initiative. The Consortium for Service Innovation’s “KCS Practices Guide” recommends comparing document hits by customer with tickets received within a certain time frame and counting any hit without a subsequent ticket from that customer as a self-service success; my team calculates this metric using a 24-hour window. Percent of tickets closed with knowledge attached is another important metric that can be used to calculate ROI, in this case by quantifying the time saved through reusing known information; it also verifies that the knowledge base is being used effectively.
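The 24-hour deflection window can be expressed as a simple counting rule. Here is a sketch; the data and the `deflections` helper are hypothetical, illustrating the rule rather than any particular system’s report:

```python
from datetime import datetime, timedelta

# Counting rule: a customer's self-service document hit counts as a
# deflected call if that customer did not open a ticket within the
# following 24 hours. All data below is invented for illustration.
WINDOW = timedelta(hours=24)

hits = [  # (customer_id, time of self-service document hit)
    ("cust1", datetime(2024, 3, 1, 9, 0)),
    ("cust2", datetime(2024, 3, 1, 10, 0)),
]
tickets = [  # (customer_id, time ticket was opened)
    ("cust2", datetime(2024, 3, 1, 11, 30)),
]

def deflections(hits, tickets, window=WINDOW):
    count = 0
    for cust, hit_time in hits:
        followed_by_ticket = any(
            c == cust and hit_time <= t <= hit_time + window
            for c, t in tickets
        )
        if not followed_by_ticket:
            count += 1
    return count

print(deflections(hits, tickets))  # cust1 opened no ticket within 24 hours -> 1
```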

*    *    *    *    *

Knowledge management metrics provide a great deal of valuable information about your implementation and are especially effective when combined with training, ticket review, and communications processes. You can achieve this synergy with little to no effort by simply leveraging the processes you already have in place—it’s that easy!

Cindy Smith is the senior knowledge manager at McGladrey LLP. She has more than fifteen years of experience developing and managing corporate IT service desk knowledge bases. Cindy is HDI-certified in Knowledge-Centered Support principles and is also the VP of communications for the HDI vChapter.
