Customer Experience, Trends, and Staff Planning

CRM Magazine (article excerpt)

If you were to ask a vice president of customer care which metric they think is most important, my bet is they would respond with "Customer Satisfaction Score." Beyond that, all the other contact center metrics we use to measure service are essentially proxies for that most-important-of-all score.

For instance, Service Level and Average Speed of Answer (ASA) are maintained because we believe long wait times lead to customer dissatisfaction. Abandons are a great proxy for customer satisfaction, too, because a customer who hangs up is almost always unhappy with the wait. And agent quality scores are maintained as the mechanism for ensuring consistently excellent interactions with customers.
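To make the proxy relationship concrete, here is a minimal Python sketch of how these three metrics are commonly computed. The call records, field names, and the 20-second threshold are all invented for illustration; real ACD platforms report these figures natively, and definitions vary (some service-level formulas, for instance, count abandons in the denominator).

    # Hypothetical call records: wait time in seconds and whether the caller abandoned.
    calls = [
        {"wait_s": 12, "abandoned": False},
        {"wait_s": 45, "abandoned": True},
        {"wait_s": 8,  "abandoned": False},
        {"wait_s": 95, "abandoned": True},
        {"wait_s": 20, "abandoned": False},
    ]

    THRESHOLD_S = 20  # assumed service-level target: answered within 20 seconds

    answered = [c for c in calls if not c["abandoned"]]

    # Service Level: share of answered calls picked up within the threshold.
    service_level = sum(c["wait_s"] <= THRESHOLD_S for c in answered) / len(answered)

    # Average Speed of Answer: mean wait across answered calls.
    asa = sum(c["wait_s"] for c in answered) / len(answered)

    # Abandon rate: share of all offered calls lost before an agent answered.
    abandon_rate = sum(c["abandoned"] for c in calls) / len(calls)

    print(f"Service Level: {service_level:.0%}, ASA: {asa:.1f}s, Abandons: {abandon_rate:.0%}")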

DIFFERENT FLAVORS OF EXPERIENCE METRICS HAVE SIMILARITIES

Customer satisfaction, first-call resolution, Net Promoter Score, and agent quality score count among the many customer experience metrics. Internally, the experience scores a contact center focuses on can differ from those tracked by other business units, and even when two scores are called the same thing, they are almost always calculated using different algorithms.

This variation makes perfect sense: different customers calling the same company are contacting the contact center for different purposes, so the experience must be attuned to the purpose of the contact. That doesn't mean different ways of measuring a customer's experience share no similarities, though. For example, customer experience metrics exhibit seasonality much like most other contact center metrics. As customers call for different purposes at different times of the year, their patience and expectations are likely to change. In the same way, an agent can be more or less motivated seasonally and will score differently week over week.
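One simple way to quantify that seasonality is with seasonal indices, the same technique planners already apply to contact volumes. A minimal Python sketch, using invented monthly CSAT figures for a single workgroup:

    from statistics import mean

    # Invented average CSAT (1-5 scale) by month for one workgroup.
    monthly_csat = {
        "Jan": 4.1, "Feb": 4.2, "Mar": 4.3, "Apr": 4.4, "May": 4.3, "Jun": 4.2,
        "Jul": 4.0, "Aug": 3.9, "Sep": 4.1, "Oct": 4.2, "Nov": 3.8, "Dec": 3.7,
    }

    overall = mean(monthly_csat.values())

    # Seasonal index: how each month runs relative to the yearly average.
    # An index below 1.0 flags months when patience, and scores, tend to dip.
    for month, score in monthly_csat.items():
        print(f"{month}: {score / overall:.3f}")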

Experience metrics also differ by contact type, location, and staff group. A sales-oriented group and a service-oriented group will score differently, for example, even when they're taking the same type of call. Centers in different locations likewise often score differently because they have different management. Ultimately, though, experience scores exhibit trends: as a workgroup improves or declines, or as company performance changes, focused training can positively affect the trajectory of an experience score.
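To separate a genuine trend from week-to-week noise, a planner can fit a simple trend line to each group's scores. A minimal sketch with invented weekly quality scores for two hypothetical groups (statistics.linear_regression requires Python 3.10 or later):

    from statistics import linear_regression

    # Invented weekly quality scores (0-100) for two hypothetical groups.
    weekly_scores = {
        "sales_team":   [82, 81, 83, 84, 85, 86, 86, 88],
        "service_team": [90, 89, 89, 88, 87, 87, 86, 85],
    }

    for group, scores in weekly_scores.items():
        weeks = list(range(len(scores)))
        slope, _intercept = linear_regression(weeks, scores)
        direction = "improving" if slope > 0 else "declining"
        print(f"{group}: {slope:+.2f} points/week ({direction})")

A positive slope suggests the group's trajectory is improving; a negative one can flag where focused training is most likely to pay off.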

HOW CAN PLANNERS USE CUSTOMER EXPERIENCE METRICS?

Like contact volumes, handle times, attrition, and shrinkage (the time-series data that planning analysts typically work with), customer experience scores exhibit seasonality, trends, and differences across contact centers. …
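Because experience scores behave like the other series planners forecast, the same mechanics apply: deseasonalize the history, project the trend, and reapply the seasonal index. A minimal sketch under those assumptions, with an invented two-year quarterly CSAT history:

    from statistics import linear_regression, mean

    # Invented quarterly CSAT history: two years, Q1 through Q4 repeating.
    history = [4.2, 4.0, 3.9, 3.6, 4.3, 4.1, 4.0, 3.7]
    SEASONS = 4  # quarters per year

    # 1. Seasonal index per quarter, relative to the overall mean.
    overall = mean(history)
    seasonal = [mean(history[q::SEASONS]) / overall for q in range(SEASONS)]

    # 2. Fit a trend line to the deseasonalized series.
    deseasonalized = [v / seasonal[i % SEASONS] for i, v in enumerate(history)]
    slope, intercept = linear_regression(list(range(len(history))), deseasonalized)

    # 3. Forecast the next quarter: project the trend, reapply the seasonal index.
    t_next = len(history)
    forecast = (intercept + slope * t_next) * seasonal[t_next % SEASONS]
    print(f"Next-quarter CSAT forecast: {forecast:.2f}")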
