by Rae Ann Bruno
Date Published August 15, 2019 - Last Updated August 15, 2019

Quality programs involve assessing the service delivery and process adherence of support teams. The goal is to improve the customer experience and increase customer satisfaction. Yet, too often, quality programs stop at the scorecards. And scorecards are just that: a numerical score that people strive to achieve. Some organizations become so numbers-oriented that their support professionals focus on “hitting the numbers” instead of restoring service and creating a great customer experience. As a result, quality actually decreases instead of improving. Worse, you can inadvertently override all the great customer service and problem-solving skills your analysts possess—what you hired them for!

So, what does it take to truly improve the customer experience? In short, alignment, balance, and coaching and continual improvement. Goals need to be defined, and the quality program needs to drive toward achieving the goals. Additionally, you need to balance numbers and quality elements in the quality program, and scoring and coaching need to be carried out consistently. Lastly, a quality program continually evolves. Through trending, scoring, coaching, and maturing processes, you can identify areas of opportunity. As new goals are defined, the scorecard will be adjusted.


Getting Started: Alignment

Too often, organizations build a scorecard without clearly defining (and documenting) their goals. What are the problems you are trying to solve? What are the goals you are trying to achieve? In other words, why do we need to establish a quality program? The why is what drives the scorecard elements.

Once the details of the scorecard are built, companies often become so focused on the scorecard metrics that they lose sight of the why behind the scorecard. When the why is not defined and documented, it often leads to an imbalance of metrics and scoring criteria. Often, the scorecards become more numbers-driven and less about the quality of how each customer experience is handled.


Overall, you are trying to establish an environment where all of your support center analysts understand departmental priorities and goals and have the skills needed to manage each customer experience. For example, you might want to grow individual skills and improve the overall customer experience. You want your team to focus on that customer experience with every contact and to use their skills to work within the parameters of policy and standard operating procedures. When they understand the why behind the quality standards and standard operating procedures, they will use their judgment and take the right action.

Next, gather information that supports these goals and challenges. Once the why is defined and information is gathered, you can build the foundation for your scorecard. For example, when reviewing customer feedback, you find that customers often say analysts don’t listen and just follow a checklist. You listen to calls and review tickets and notice two things:

  1. Often, the support center analyst begins troubleshooting before asking enough questions to get all the details. In other words, they hear key words and begin working toward a solution.
  2. In moving quickly toward a resolution, the support center analysts often fail to acknowledge what the customer/user said; they immediately begin to talk about the technology steps instead of the business issue.

When analysts shortcut information gathering, tickets are often categorized incorrectly, sent to the wrong queue, and missing the details needed to work toward resolution. This increases the time the user is unable to work. Additionally, when analysts fail to acknowledge and show empathy, bad experiences lead to unnecessary escalations to management. To change behavior and improve service, you set goals for the team:

  • Paraphrase what they hear, ask questions, verify the details shared, and document the details in the ticket. 
  • Empathize and acknowledge what the customer says or feels.
  • Correctly and accurately categorize and route the ticket.

Not only have you clearly defined the why behind the goals, but you’ve identified specific actions the team needs to take to overcome the current perception, become more efficient, and improve the quality of the customer experience.

This example shows how to lay the foundation for building the scorecard. Next, the focus needs to be on balancing quality efforts with numeric targets.
 

Building the Scorecard: Balance

To assess the quality of the customer experience, incident monitoring (ticket audits), knowledge monitoring, and call monitoring all need to take place. Based on the previous example, here are the scorecard elements and how to assess them:

Scorecard Criteria and How to Assess Them

  • Analyst acknowledged the emotion and/or business situation that the customer described and showed empathy for the customer’s emotion.
    How to assess: Call monitoring. Listen to how the analyst responds and what he or she acknowledges.

  • Analyst accurately identified the issue and correctly categorized the ticket.
    How to assess: Incident monitoring. Ensure there are sufficient notes identifying the issue and documenting what the analyst tried and what results were experienced.

  • Analyst searched the knowledge base to find the solution.
    How to assess: Knowledge monitoring. If you have technology that allows you to see search terms, review them and ensure they were in line with what the customer was saying. Make sure that the correct article (the one that resolves the issue) is attached.
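If your quality program lives in a tool or a script rather than on paper, it helps to store each criterion together with its why and its assessment method so that neither gets lost as the program evolves. Here is a minimal sketch in Python of one way to represent the scorecard above; the names (Criterion, AssessmentMethod, SCORECARD) are illustrative, not from any particular QA product:

```python
from dataclasses import dataclass
from enum import Enum

class AssessmentMethod(Enum):
    """The three monitoring activities described above."""
    CALL_MONITORING = "call monitoring"
    INCIDENT_MONITORING = "incident monitoring"
    KNOWLEDGE_MONITORING = "knowledge monitoring"

@dataclass
class Criterion:
    description: str          # what the assessor looks for
    why: str                  # the documented "why" behind this element
    method: AssessmentMethod  # how the element is assessed

SCORECARD = [
    Criterion(
        description="Acknowledged the customer's emotion/business situation and showed empathy",
        why="Customers should feel heard, not processed through a checklist",
        method=AssessmentMethod.CALL_MONITORING,
    ),
    Criterion(
        description="Accurately identified the issue and correctly categorized the ticket",
        why="Correct categorization routes the ticket properly and reduces downtime",
        method=AssessmentMethod.INCIDENT_MONITORING,
    ),
    Criterion(
        description="Searched the knowledge base and attached the resolving article",
        why="Reusing known solutions speeds restoration",
        method=AssessmentMethod.KNOWLEDGE_MONITORING,
    ),
]

# Group criteria by monitoring activity when preparing an assessment session.
by_method = {m: [c for c in SCORECARD if c.method == m] for m in AssessmentMethod}
```

Keeping the why on record next to each element makes it harder for the scorecard to drift away from its goals over time.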

 

It is important to recognize that the way each analyst shows empathy and acknowledges what the customer says might differ. Don’t require certain words to be said. When phrases are required, analysts sound unnatural and insincere. Also, instead of listening and responding with appropriate words or phrases, the analyst focuses on working in the required statement. Common examples of required phrases include the following:

  • Say the customer’s name three times during the call.
  • Say, “I’m sorry” and “thank you.”
  • Ask, “Have I resolved your issue today?”

Here is an example of what might happen as a result.

Joe received a deduction on his quality scorecard because he did not literally say, “Have I resolved your issue today?” Yet, the reason that question is on the scorecard is to ensure that the analyst verifies that the resolution works and that the customer agrees.

Let’s review what Joe did say during the call:

Customer: “It’s working now! Thank you!”

Joe: “It is working? Great. Is there anything else I can do for you today?”

Customer: “No, that’s all I needed. Thank you so much!”

The analyst needs to ensure that the customer agrees that the issue has been resolved (this is the “why”). It’s not about the exact words; it’s about checking to make sure it is resolved. Joe definitely did that. He responded to the customer’s comment, and it was natural and showed that he was listening and engaged. Let’s revisit the scenario and ensure that he meets the specific criteria of the scorecard.

Customer: “It’s working now! Thank you!”

Joe: “It is working? Great. Have I resolved your issue today?”

This is unnatural and makes it look like Joe didn’t really listen and is following a script. It would lower the service desk’s credibility and overall customer satisfaction. It could also be the reason customer satisfaction survey feedback reported that the service desk follows a checklist. And it strays too far from the why.

This shows that you need to define not only the overall why for the quality program, but also the why for each element on the scorecard. Joe met the quality criteria in his call (the first example): he checked with the customer to make sure the issue was resolved. Yet, as the quality program evolved, the criteria became too restrictive and the why was forgotten.

We hire people for their ability to listen and respond appropriately in any situation. Forcing support center analysts to follow a script on responses forces them to focus on their scores, not on being engaged in the call and managing the customer experience.

Analysts who work under rigid scorecards frequently report that they feel so much pressure to avoid getting “dinged” on their quality scorecard that they don’t always do what is best for the customer. This is especially true when scorecards include numbers such as these:

  • Resolve the issue in 15 minutes or less
  • Keep after-call work shorter than 1 minute
  • Say the customer’s name within 20 seconds
  • Attach a knowledge article within 2 minutes

Analysts become focused on hitting the numbers and aren’t fully listening to their customers. Instead of listing rigid numbers to hit, keep the criteria high level, as in the example with Joe, and allow the analysts to use their natural approaches and their great customer service skills.

The results will be service desk analysts who are really listening to and helping their customers. They will make real connections, sound sincere, and facilitate service restoration and higher productivity, for both customers and the service desk.

Coaching and Continual Improvement

A quality program needs to evolve as the service desk improves its service delivery. It is important to solicit feedback both from customers and the service desk analysts on a regular basis (quarterly or every six months). Feedback can identify what needs to be modified or added to the quality program.

Also, keep in mind that the quality program is in place to grow analysts’ skills and improve the customer experience. It should be a way of recognizing improvement, not a punitive effort. Recognition and achievement are two motivational factors that engage employees. As analysts become more proficient in their roles, their pride and confidence grow, and they deliver better service. Here are three ways to keep analysts engaged and focused on improvement:

  • Let team members listen to their own calls, audit their own tickets, and score themselves. During the coaching session, you can compare your assessment to theirs and provide constructive feedback.
  • Coach at least once a month; consistent coaching practices lead to consistent service delivery. Assess and score every month, and spend face-to-face time (or calls/Skype for remote team members) on coaching. Make it a safe environment with open discussion.
  • Make coaching sessions interactive by involving analysts in generating improvement ideas. Ask them how they would handle a situation differently and what would help them grow (e.g., training, ideas for responses).

In addition to coaching and soliciting feedback, it is important to document the responses. Documentation makes it easier to identify what needs to be reviewed and which areas to improve. It also makes it easier to check proposed changes against the why and avoid changes that contradict the goals.

For example, if you get feedback that the service desk doesn’t understand business issues, it is important to put quality criteria in place that help you assess those skills and grow them through coaching.

  • Issue or feedback: “Didn’t understand the business issue” or “Didn’t have a proper sense of urgency”
    What to review: Did the analyst listen? Did the analyst ask questions? Did the analyst search the knowledge base or previous tickets?
    Areas: Process, knowledge, skills

  • Issue or feedback: “Took too long to resolve the issue”
    What to review: Did the analyst understand the problem? Did the analyst gather sufficient information? Was the ticket categorized correctly? Was the ticket routed correctly? Was there sufficient information documented so that the escalation point could pick up where the first-level analyst left off? Was this first-level resolvable? Was there a knowledge article for this (and was it used)? How long was it before the ticket was reassigned? Was the response time within the service level parameters?
    Areas: Process, knowledge, ticket documentation, ticket management

  • Issue or feedback: “The agent didn’t seem interested in helping me”
    What to review: Tone of voice; comments made and questions asked; results (did the analyst own or reassign?); policy or facts (did the analyst simply state policy or facts without providing an alternative?)
    Areas: Vocal elements, customer service skills
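Like the scorecard itself, this mapping can be kept in a form a coach can query when preparing a session. Below is a small, hypothetical Python sketch of the table above; FEEDBACK_REVIEW_GUIDE and coaching_checklist are illustrative names, not part of any ITSM tool:

```python
# Hypothetical mapping from customer-feedback themes to the review
# questions and improvement areas in the table above.
FEEDBACK_REVIEW_GUIDE = {
    "didn't understand business issue": {
        "review": [
            "Did the analyst listen?",
            "Did the analyst ask questions?",
            "Did the analyst search the knowledge base or previous tickets?",
        ],
        "areas": ["process", "knowledge", "skills"],
    },
    "took too long to resolve": {
        "review": [
            "Did the analyst understand the problem and gather sufficient information?",
            "Was the ticket categorized and routed correctly?",
            "Was enough documented for the escalation point to pick up the work?",
        ],
        "areas": ["process", "knowledge", "ticket documentation", "ticket management"],
    },
    "agent didn't seem interested": {
        "review": [
            "Tone of voice",
            "Comments made and questions asked",
            "Did the analyst own the ticket or simply reassign it?",
        ],
        "areas": ["vocal elements", "customer service skills"],
    },
}

def coaching_checklist(feedback_theme: str) -> list[str]:
    """Return the review questions for a feedback theme, if it is tracked."""
    entry = FEEDBACK_REVIEW_GUIDE.get(feedback_theme)
    return entry["review"] if entry else []
```

Keeping the guide in one queryable place also makes it easy to spot when a new feedback theme has no review questions yet, which is itself a signal that the quality program needs to evolve.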

 

As you recognize trends or habits that need to change, build them into the scorecard. As needed, communicate what trends or comments are driving the changes; then educate, assess, and coach team members, helping them to grow their skills and wow their customers. Coaching needs to remain a constant; don’t stop coaching!

A Virtuous Cycle

It is important to remember that customer service and technical support is a profession; not just anyone can do it. Pay attention to alignment, balance, and coaching and continual improvement, and you will have a quality program that lets analysts’ natural skills emerge, meets its goals, identifies improvements, and allows people to continually grow. You’ll have engaged employees who trust their companies. Loyal, happy support center analysts attract and keep loyal, happy customers.


Rae Ann Bruno is the president of Business Solutions Training, Inc., where she consults and trains in various areas of ITIL, KCS, communications, internal marketing, metrics, and process improvement. Rae Ann holds several ITIL certifications, is a faculty trainer for HDI, and is the author of Translating IT Metrics into Business Benefits and What Have You Done for Me Lately? Creating an Internal Marketing Culture. She is also a member of the HDI International Standards Committee.


Tag(s): supportworld, technical support, customer experience, workforce enablement, metrics and measurements, balanced scorecard, quality assurance
