by Johann Stoessel
Last Updated: February 25, 2016

So…how are your metrics? Do you need a benchmarking warm-up? Well, there’s no time like the present. Let’s get cracking!

For many years, I worked for a company that managed by feel. While I knew how many contacts we received and how many tickets we were creating, management had no real interest in either the hard data or any kind of formal reporting. While this management style is still alive and well in many workplaces, the bursting of the tech bubble and the recession forced many companies to adopt a more structured management style. Mine was no different. Suddenly, I was being asked to generate reports that were more data-driven. To do this, I leaned heavily on the HDI Practices & Salary Reports, which helped me understand where my organization stood compared to others in my vertical and the industry at large.

Wait, what? You say you aren’t using one of HDI’s premier member resources?

Everyone, regardless of vertical, industry, or position, needs to set some time aside to review HDI’s annual industry reports. In the spring, HDI releases the HDI Desktop Support Practices & Salary Report; every other summer, the HDI Customer Satisfaction Benchmarking Study Report; and in the fall, the HDI Support Center Practices & Salary Report. All three are great tools that aggregate and highlight many key data points. Anyone who takes the time to review—let alone use—this data will be better off for the effort, I promise. Even if your company doesn’t look at metrics, I cannot stress enough the value you can gain by comparing this industry data and your organization’s data.

In my case, the amount of effort I put into the research was far outweighed by the traction my support organization gained. (If everything I’d done in my career to that point had such a great ROI, I could’ve retired at 35!) In support, it’s easy to be overlooked. Imagine your support organization is the drummer in a rock band, in the background laying down a solid foundation beat for the organization. As long as you show up to every gig (take every request) and continue to provide a consistent beat (resolve every request), no one really looks at you. But what happens when it’s time for your drum solo? Will you have the goods? Can you tell a compelling story, one that’s based on real data?

For me, everything started with a look at metrics; that baseline proved to be incredibly valuable, in ways I never expected. Start with what you have. Take customer satisfaction, for example. Regardless of how you measure customer satisfaction (e.g., survey of closed tickets, annual surveys, random surveys), you should be regularly measuring your organization’s progress toward a set target (goal). You should also have selected a reporting tool; everything in this article was generated using an old standby, Microsoft Excel.

The graph below illustrates a year’s worth of customer satisfaction scores (by quarter).

Compelled? I’m not. This doesn’t really tell me much, other than the fact that satisfaction went up in Q2, almost dipped back to Q1 levels in Q3 (yawn), and then the doors blew off in Q4. But what if we include our goal (92.5)? Better. We clearly exceeded our goal in all four quarters, but what does that really mean? This report still isn’t compelling.
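
(Everything in this article was built in Excel, but if you prefer to script your reports, here is a minimal Python/matplotlib sketch of that same CSAT-versus-goal view. The quarterly scores are placeholders chosen to mirror the shape described above, not the actual benchmarking figures.)

    import matplotlib.pyplot as plt

    quarters = ["Q1", "Q2", "Q3", "Q4"]
    csat = [93.1, 94.0, 93.3, 96.2]   # placeholder scores, all above goal
    goal = 92.5

    fig, ax = plt.subplots()
    ax.plot(quarters, csat, marker="o", label="Customer satisfaction")
    ax.axhline(goal, color="gray", linestyle="--", label="Goal (92.5)")
    ax.set_ylabel("Satisfaction score")
    ax.set_ylim(90, 100)
    ax.set_title("Customer satisfaction by quarter vs. goal")
    ax.legend()
    plt.tight_layout()
    plt.show()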

What if we look at customer satisfaction in the context of other important metrics? Let’s consider volume. Incident and/or request volume is a very important metric. (If you aren’t measuring incidents or requests, make it a stretch goal for the future.) Without the incidents and requests (that is, tickets), support organizations don’t have much to do. So, how do we depict volume and customer satisfaction on a single chart? What are the chances our ticket volume will fall within the same scale? Not great. So let’s take a different approach to that data.

Interesting. By applying conditional formatting (orange is bad, blue is good), the change in volume is clear without needing to add that data to the graph. Right now, we’ve got a bass drum and some volume. But great rock bands are more than that, so let’s roll in some additional components.
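
(If you want to reproduce that conditionally formatted table outside of Excel, here is a rough pandas sketch. The ticket counts are made up for illustration; substitute an export from your own ticketing system.)

    import pandas as pd

    df = pd.DataFrame(
        {
            "Quarter": ["Q1", "Q2", "Q3", "Q4"],
            "Tickets": [5600, 4900, 5200, 5700],   # made-up ticket counts
            "CSAT": [93.1, 94.0, 93.3, 96.2],      # made-up satisfaction scores
        }
    ).set_index("Quarter")
    df["Volume change"] = df["Tickets"].diff()

    def shade(change):
        # orange = volume went up (bad), blue = volume went down (good)
        if pd.isna(change):
            return ""
        return "background-color: orange" if change > 0 else "background-color: lightblue"

    styled = df.style.applymap(shade, subset=["Volume change"])  # .map() in pandas >= 2.1
    styled.to_html("volume_vs_csat.html")  # open the file in a browser to see the shading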

The lead guitar provides a memorable riff, giving the audience its first impression of a song. For the support organization, that first impression comes from the average speed to answer (ASA). While it’s not always a representation of the service the caller will receive, it is the caller’s first impression of the performance. For simplicity’s sake, our sample data focuses on ASA for the phone. What’s the best way to layer in this new data?

Next up, the bass guitar: abandonment rate (ABD). This report is really starting to take shape! Our last two components are keyboard and vocals: service desk resolution rate and first call resolution rate. (We won’t get into definitions here, but if you want to learn more, I recommend HDI’s online glossary.)
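
For the fully layered graph, the trick is keeping the percentage-based metrics (customer satisfaction, abandonment, and the resolution rates) on one axis and ASA, which is measured in seconds, on a secondary axis so the scales don’t fight. Here is one way to sketch that layout in matplotlib; again, every number below is illustrative rather than taken from the benchmarking data.

    import matplotlib.pyplot as plt

    quarters = ["Q1", "Q2", "Q3", "Q4"]
    csat = [93.1, 94.0, 93.3, 96.2]           # customer satisfaction, %
    abandonment = [7.5, 6.0, 6.8, 4.9]        # abandonment rate, %
    sd_resolution = [78.0, 80.5, 82.0, 84.3]  # service desk resolution rate, %
    fcr = [65.0, 67.5, 69.0, 72.1]            # first call resolution rate, %
    asa = [55, 42, 48, 35]                    # average speed to answer, seconds

    fig, ax_pct = plt.subplots()
    ax_sec = ax_pct.twinx()  # secondary axis keeps ASA seconds off the percent scale

    ax_sec.bar(quarters, asa, alpha=0.25, color="gray", label="ASA (seconds)")
    for label, series in [
        ("Customer satisfaction", csat),
        ("Abandonment rate", abandonment),
        ("SD resolution rate", sd_resolution),
        ("First call resolution", fcr),
    ]:
        ax_pct.plot(quarters, series, marker="o", label=label)

    ax_pct.set_ylabel("Percent")
    ax_sec.set_ylabel("Average speed to answer (seconds)")

    # combine the legends from both axes into a single box
    h1, l1 = ax_pct.get_legend_handles_labels()
    h2, l2 = ax_sec.get_legend_handles_labels()
    ax_pct.legend(h1 + h2, l1 + l2, loc="upper left", fontsize=8)
    plt.tight_layout()
    plt.show()

In Excel, the equivalent move is a combo chart with ASA plotted on the secondary axis; either way, the point is that the layers share one picture without one metric drowning out the others.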

Are you ready to rock?! This report is. Between the layered graph and the conditionally formatted data, we can see that this song tells many stories: 

  • It tells the story of the poor implementation that happened in Q4 2011, which was still driving high ticket volumes in Q1 2012 and raising metrics across the board. (This one requires a little root cause analysis, but the data are a key indicator.) 
  • It also tells the story of how the additional staff who were brought on in Q1 2012 really hit their stride in Q3 and Q4 and were able to deal with an increase in ticket volume.

We may not know everything about this support organization, but one thing’s for sure: its metrics rock! Management should now be able to lead with data, not by feel.

Finally, remember, no single report will include all of your data, and that’s okay. Even uncharted, this data can still help tell your organization’s story. But beware of adding extra metrics that don’t add value to the story. More isn’t always better. Ask your peers and team members to review your report and tell you if it’s clear or if you’ve muddied the waters.

*    *    *    *    *

The data in our example was taken from the HDI Forums Benchmarking Project (names were changed to preserve the organizations’ anonymity). Members of the groups participating in this project are asked to submit quarterly data for a list of metrics selected by the group. (Many of these metrics aren’t included in HDI’s industry reports because they’re specific and important only to certain verticals.) Members also complete a profile that includes more than 150 data points, covering staffing, hours of operation and after-hours support, type of support, support channels, types of calls, devices and applications supported, data centers, support tools, training and certification, service level management, metric goals, performance metrics, and ticket types by volume.

It’s easier to benchmark against organizations that are very closely aligned, and the value these organizations gain from this level of effort is priceless. But you don’t need to join an HDI Forum to create reports and benchmark metrics for your organization. HDI has many resources, templates, and tools—please take advantage of them! Earn that Grammy!

So, one more time, all together now…When I say “bench,” you say “mark”! When I say “metrics,” you say “rock”!

 

Johann Stoessel has spent the past thirty-two years honing his support skills in multiple verticals. Regardless of the vertical, Johann has always pursued his number-one goal: connecting with people. In 2008, Johann joined the HDI Retail Forum and now, as program manager, Johann’s goal is to find innovative ways to engage these diverse groups and increase the value of the HDI Forums. Connect with him on LinkedIn and tell him how metrics and reporting have made your job easier!
