Date Published May 23, 2012
Stunned silence. Then nervous laughter. Then full-blown laughter. That’s the sequence of events that transpired when I asked the senior IT leadership team of one of my clients to drastically change their service desk reports. Not stop doing them, mind you. Something more insidious: changing them, fudging them.
Let’s back up a second. They had invited me to facilitate a week-long offsite for them, and as part of this exercise, they sent me a list of the metrics they reported to the executive team and their peers, and used to run their business. (This was an internal IT staff, so they didn’t interact with customers or partners directly.)
As expected, the reports were gorgeous. There were KPIs galore and a dashboard that would have done the bridge of the Enterprise proud. Noncompliant measures beeped and buzzed and fired off alerts to all the right people in real time. Beautiful stuff. Until they realized that senior executives outside IT didn’t really care to know what was going on unless there was a problem. The whole situation was even more depressing when you considered the time it had taken to put those reports together.
If there’s one thing service desks are good at, it’s collecting information and reporting on it. If there’s one thing they’re terrible at, it’s focusing on the few instead of the many. But one way to find out whether anyone actually reads the reports you spend so much time building (not to mention the opportunity cost of adding fields and collecting data from your incident, problem, and knowledge management systems) is to change the data. If no one notices, that’s a prime opportunity to change what you are doing and shift your focus.
So if we are spending all this time collecting and reporting on metrics, why is so little done with this information at the strategic level?
In my experience, the single most common reason is misalignment between an organization’s “big picture” goals and the way individuals are measured. In other words, the intent of the organization’s goals is lost by the time it reaches the individual, whose only question is, “What does that mean to me?”
Most of what we use to measure service success involves measuring what is easy to measure, so-called “activity-based” measures. Activity-based measures are easy to measure and easy to manipulate. One example is average handle time (AHT), or the average length of time an agent is on the phone with a customer. This is easy to measure. It’s also easy to manipulate, since an agent can simply end the call when its AHT exceeds the goal. This contrasts with “outcome-based” activities, which are much harder to measure and harder for individuals to directly manipulate. A good example is customer loyalty.
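To make the contrast concrete, here is a minimal sketch (the goal of 420 seconds and the call durations are invented for illustration) of how easily an activity-based measure like AHT can be moved without any change in actual outcomes:

```python
def average_handle_time(call_durations_sec):
    """Average handle time (AHT): mean call length in seconds."""
    return sum(call_durations_sec) / len(call_durations_sec)

# Honest calls: agents stay on the line until the issue is resolved.
honest = [340, 610, 275, 480, 520]

# "Gamed" calls: any call approaching the 420-second goal is simply ended.
gamed = [min(d, 419) for d in honest]

print(average_handle_time(honest))  # 445.0 -- over goal
print(average_handle_time(gamed))   # 374.4 -- comfortably under goal
```

The measure improved; the customers whose calls were cut short did not.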
Tip: By all means, include activity-based measures. Just don’t put goals on them. For example, you should monitor the abandon rate on your phone queue, perhaps as a lower-level activity, since abandon rates are a leading indicator of customer satisfaction.
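As a hypothetical illustration of monitoring without grading, an abandon rate is just abandoned calls over calls offered; the point is to watch the trend, not to hit a target (the weekly figures below are made up):

```python
def abandon_rate(abandoned, offered):
    """Fraction of offered calls that hung up before reaching an agent."""
    return abandoned / offered if offered else 0.0

# (abandoned, offered) per week -- illustrative numbers only.
weekly = [(12, 480), (19, 505), (31, 512)]
rates = [abandon_rate(a, o) for a, o in weekly]

# A rising trend is the signal to investigate -- not a number to be gamed.
print([round(r, 3) for r in rates])  # [0.025, 0.038, 0.061]
```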
Advanced Tip: Don’t attach goals to activities; attach them to outcomes. For example, one of the best ways to compromise the integrity of your knowledge base is to ask your team to create or update two knowledge base articles a week. This will result in exactly that many being created or updated—often in very creative ways! It’s better to set goals on the outcome; for example, customer satisfaction or customer loyalty. Since these are inherently harder to measure, and harder for individuals to manipulate, team members must ask questions and actually seek to understand customers’ problems before they can attempt to solve them.
How do you know if your organization has measures that aren’t aligned with the big picture? Well, how many of these statements describe your organization?
- Whiplash from high-profile, high-visibility projects that are launched with a bang and then fizzle out, only to be replaced by a new set of “transformative” projects the following year.
- Lots of fire-fighting and thrashing around, which generates heat, noise, meetings, and reports, but very little demonstrable progress.
- Crossfunctional groups that constantly have to go up and across the management chain to deal with issues, instead of members of different teams being able to work out issues directly.
- An inordinate obsession with customer satisfaction as a proxy for the general state of the client relationship.
Remember, in most companies, only a tiny fraction of customers ever contact you, even when they have a problem. (They usually attempt to solve it themselves, or they reach out to others in the community.) Of this tiny fraction, an even tinier fraction responds to transactional surveys asking how “happy” they are. Managers often forget how small and self-selected a sample customer satisfaction scores are built on, and that’s a mistake. I’ve seen significant bonuses paid out based on customer satisfaction scores. Good intent, poor execution. Customer satisfaction has its place, as does loyalty (which is typically far more useful). Just don’t fixate on a single number.
If you’re looking for ways to align your strategic intent with the behaviors throughout the organization, here are some concrete steps you can take.
- Come up with a litmus statement and align it with your company’s purpose and objectives. If a vision statement is what you want to be when you grow up, and a mission statement is how you are going to get there, then a litmus statement is a short-and-sweet statement that describes what your team is accountable for. One client came up with “We own subscriber loyalty,” another with “We are the advocate for the customer within the company.”
Tip: What’s the litmus test for your litmus statement? If you fumble for an answer when your grandmother asks you what you do for a living, you do not have a litmus statement.
- Come up with three or four balanced, high-level measures that align with your company’s goals, and use them to help you manage the organization. These should be crossfunctional and applicable to all teams. Sample measures could revolve around the “golden four”: increasing revenue from new customers; increasing revenue from existing customers; reducing the costs associated with serving your customers; and customer loyalty. Of course, they should be far more specific than that. And you will need other measures to run your business, just not at the senior management level.
Tip: Be sure to rationalize, clarify (globally), and dramatically reduce your existing measures. They should be easy to explain, their intent should be easy to understand, and, as much as possible, they should be framed from the customer’s point of view. Here’s a good way to ruthlessly chop the number of measures you currently monitor: any measure that does not help with at least two of your goals should not be part of the senior team’s dashboard. If it’s critical, they can watch it at the project level.
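The two-goal pruning rule in the tip above can be sketched as a simple filter (the measures and goal mappings here are invented for illustration, not a recommended set):

```python
# Map each candidate measure to the strategic goals it supports (illustrative).
measure_goals = {
    "customer loyalty":         {"revenue-existing", "reduce-cost-to-serve"},
    "first contact resolution": {"reduce-cost-to-serve", "revenue-existing"},
    "average handle time":      {"reduce-cost-to-serve"},
    "tickets closed per agent": set(),
}

# Keep only measures that support at least two goals for the senior dashboard.
dashboard = {m for m, goals in measure_goals.items() if len(goals) >= 2}

print(sorted(dashboard))  # ['customer loyalty', 'first contact resolution']
# Everything else stays visible at the project level, off the senior dashboard.
```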
Advanced Tip: At all levels, outcome goals that are crossfunctional and aligned with the business are fantastic. For example, don’t just celebrate when a software version is released; really celebrate when customers are adopting it and using it. This cuts across sales, marketing, engineering, support, professional services, training, and others.
- Come up with a few highly focused projects to help you realize your litmus statement (think people-, process-, and technology-related projects that help the entire customer ecosystem).
Tip: Involve your frontline teams in choosing, defining, and executing these projects. And when creating teams, mix it up in terms of experience, geography, skill levels, and even functions. Let your teams develop the messaging, including the WIIFM (“what’s in it for me?”) message for stakeholders. Listen to them and let them lead. Do this right and watch employee morale soar (and enjoy all the goodness that comes along with it).
- Measure your direct reports and your teams the new way, by guiding, not grading. Outcome-based goals force you to ask questions in order to really understand what is going on. Seek to understand before you seek to solve. There won’t be any simple answers, no clear-cut good or bad; the numbers will only make sense in context. For example, a high customer satisfaction score may actually be bad if you’re going bankrupt in the process.
Guiding, not grading, is one of the most difficult management approaches. You will initially feel like you’re giving up a lot of control and flying blind. However, once you are free of the noise and can focus exclusively on the most important things to the business, you and your teams will find it very liberating.
Phil Verghis is the principal at The Verghis Group. He is a trusted advisor to service and support leaders around the world, helping them earn the respect of their customers and demand respect for their customers. Phil has won numerous industry awards and accolades over the years, and has been hailed as a brilliant strategist and innovator. He is also the author of The Ultimate Customer Support Executive, a longtime friend of HDI, and a past chair of the HDI Strategic Advisory Board.