Analytics-Enabled QA: It's Time!
Quality assurance has always been considered a tedious task, one that supervisors and QA specialists avoid, even while they appreciate its importance. Since QA is labor-intensive, most companies limit its scope (and, consequently, its benefit) and evaluate only 3 to 10 calls per agent per month. This means that the findings are statistically invalid, and if you catch something important, it's probably by accident. (Few organizations quality-assure even this many of their non-call interactions—emails, chat sessions, SMS, social media posts, etc.)
Even this limited amount of QA has been shown to motivate agents to comply with department requirements, as they know that they could be monitored at any time. This alone establishes QA as mission-critical to the enterprise, not just the contact center. But there needs to be a better way of doing it, and this is where speech analytics comes into play.
The Automated Way of Doing QA
Analytics-enabled QA leverages speech and text analytics, business rules, and automation to identify, classify, and rank calls and agents that require management attention, because they are either exceptionally good or notably bad. The automation can also be used to identify and classify operational issues caused by other front- and back-office departments such as sales, marketing, credit, billing, statement rendering, payment processing, manufacturing, product design, and fulfillment.
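To make the identify/classify/rank idea concrete, here is a minimal sketch of how business rules might be applied to call transcripts produced by a speech analytics engine. The rule phrases, category names, and priority scheme are all hypothetical illustrations, not any vendor's actual logic:

```python
# Hypothetical sketch: rank call transcripts for QA review using simple
# business rules applied to speech-analytics transcript output.

ESCALATION_PHRASES = ["speak to a supervisor", "cancel my account", "file a complaint"]
PRAISE_PHRASES = ["thank you so much", "you've been very helpful"]

def score_call(transcript: str) -> dict:
    """Classify one transcript and assign a review priority."""
    text = transcript.lower()
    negatives = sum(p in text for p in ESCALATION_PHRASES)
    positives = sum(p in text for p in PRAISE_PHRASES)
    if negatives:
        # Escalation language: flag for supervisor attention, weighted by hits.
        category, priority = "needs attention", 2 + negatives
    elif positives:
        # Praise language: candidate for coaching examples and recognition.
        category, priority = "exemplary", 1
    else:
        category, priority = "routine", 0
    return {"category": category, "priority": priority}

def rank_for_review(calls: dict) -> list:
    """Return call IDs sorted so the highest-priority calls surface first."""
    scored = {cid: score_call(t) for cid, t in calls.items()}
    return sorted(scored, key=lambda cid: scored[cid]["priority"], reverse=True)
```

A real deployment would use the analytics vendor's categorization engine rather than substring matching, but the shape is the same: rules turn 100 percent of interactions into a ranked queue, so reviewers start with the calls most worth their time.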
Injecting analytics into the QA process yields a range of benefits: It allows companies to cost-effectively evaluate 100 percent of all customer interactions; it gives managers a clearer picture of both customer issues and agent performance; it improves agent performance, since the staff is aware it's always being monitored; it automates the process of identifying contact center and enterprise trends and opportunities; and, finally, it makes supervisors more productive, as they no longer have to listen to dozens of routine calls just to pinpoint the handful that require their attention.
Taking Advantage of Speech Analytics
If you've invested in post-call (historical) speech analytics, it's time to take the next step and use analytics to automate and dramatically enhance your QA program—you already own the software, so why not get more use out of it? Start by reaching out to your speech analytics vendor and asking for help in developing an analytics-enabled QA initiative. Here are some proven best practices to help you get it right:
• Identify the personnel who will be overseeing and using the application on a daily basis and have them trained so that they can actively participate in the implementation and take over management of the system. In addition, with the vendor's implementation staff, develop a project plan for transitioning your manual QA program to an automated, analytics-enabled program. Identify the system feeds and integrations required to capture the types of interactions that will be handled by the new QA process—e.g., calls, emails, chat sessions, SMS, social media.
• Review and enhance the existing QA evaluation forms. If you've performed QA just on calls in the past, create alternative evaluation forms and criteria for other channels. Decide which aspects of the QA evaluation process will benefit most from applying analytics; work with the vendor to create the QA evaluation search criteria.
• Pilot and test the new analytics-enabled QA process. It will take time and effort to ensure the application is capturing the right types of transactions; even a packaged application will require a great deal of trial and error.
• Define and build out the dashboards and reports to make sure the information provided by the new analytics-enabled QA application can be applied on a timely basis.
• Retrain all managers, supervisors, QA specialists, and agents so that they know what to expect and how the new system and process work and impact their jobs.
• Evaluate the application's effectiveness on an ongoing basis to improve its results, findings, and benefits. Three months after the implementation, evaluate and benchmark its performance.
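The evaluation-form step above can be sketched in code as well. One way to express channel-specific QA forms is as weighted criteria with a scoring helper; the criterion names and weights below are hypothetical examples, not a standard:

```python
# Hypothetical sketch: channel-specific QA evaluation forms expressed as
# weighted criteria, with a helper that turns pass/fail marks into a grade.

EVALUATION_FORMS = {
    "call": {"greeting": 10, "identity_verification": 20, "issue_resolved": 50, "closing": 20},
    "email": {"correct_template": 20, "grammar_and_tone": 20, "issue_resolved": 60},
}

def score_evaluation(channel: str, results: dict) -> float:
    """Return the percentage score for one evaluated interaction.

    `results` maps each criterion on the channel's form to True (met)
    or False (missed); missing criteria count as missed.
    """
    form = EVALUATION_FORMS[channel]
    earned = sum(weight for criterion, weight in form.items() if results.get(criterion))
    total = sum(form.values())
    return 100.0 * earned / total
```

Keeping the forms as data rather than hard-coding them makes it straightforward to add forms for new channels (chat, SMS, social media) and to hand the same criteria to the analytics vendor as the basis for automated search rules.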
Speech and text analytics do not eliminate the need for supervisors or QA specialists to monitor calls or to review emails, chat sessions, SMS interactions, or social media posts. When situations are flagged by an analytics application, the QA specialist or supervisor still needs to listen in or review the interactions to confirm the solution’s findings. But by automating the process, management gains visibility into the contact center’s performance in a more timely way, and is positioned to more rapidly respond to issues. And this will only improve the customer experience.
Donna Fluss (firstname.lastname@example.org) is the president of DMG Consulting, a provider of contact center, analytics, and back-office market research and consulting.