In the call center world, self-service has historically been delivered to customers primarily through the voice channel – namely, the IVR. But for all that time, you always knew you were in an IVR. Natural language speech recognition made a valiant attempt to humanize the IVR experience, but it never really got there. More often, the result was frustration.
Then, around the year 2000, chat and email channels arrived in customer service centers (or, by that point, contact centers). One somewhat revolutionary capability this enabled was the auto response: a machine replying automatically with preset responses based on word spotting, along with preset responses that agents could choose from while in the throes of dialogue with their customers.
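The word-spotting mechanism described above can be sketched in a few lines. This is a minimal illustration of how such rule sets worked; the keywords and canned replies here are hypothetical examples, not from any real system.

```python
# Rule-based "word spotting" auto-responses, roughly as early
# contact-center systems worked. Keywords and replies are invented.

AUTO_RESPONSE_RULES = {
    ("refund", "money back"): "Our refund policy allows returns within 30 days.",
    ("password", "reset"): "To reset your password, visit your account page.",
    ("shipping", "delivery"): "Standard shipping takes 3-5 business days.",
}

def auto_respond(message):
    """Return a preset response if any keyword is spotted, else None."""
    text = message.lower()
    for keywords, response in AUTO_RESPONSE_RULES.items():
        if any(kw in text for kw in keywords):
            return response
    return None  # no rule matched; route to a human agent

print(auto_respond("How do I get my money back?"))
```

Note that the system cannot handle an inquiry outside its rule set – exactly the limitation the article describes: a human must study failed interactions and add new rules by hand.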
Was this AI? Well, not really. But it was the beginning of applying knowledgebase responses to customer service inquiries, even if that knowledgebase was really just a small set of rules – the rules knew which response to send. However, those rule sets could not learn; they could only be expanded by humans who studied their customer service organization's events and then wrote new rules.
Fast forward to today, and we see the rise of artificial intelligence – or, more accurately, machine learning. Imagine a computer-driven system that, instead of executing its programmer's instructions to the letter, is given complex algorithms that allow it to learn from vast amounts of data and make predictions. Feed the system more data, and it continues to adapt its knowledge based on analysis of trends in that data. We see evidence of machine learning everywhere these days – an obvious example being search engines that predict what you want to search for and return relevant results despite spelling or grammatical errors, or the filter that decides what to put into your junk email folder.
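The junk-folder example can be made concrete with a toy word-frequency classifier that improves as it is fed more labeled examples. This is a simplified sketch of the "learns from data" idea, not a production spam filter; the training phrases are invented.

```python
# A toy classifier: count word frequencies per label, then score new
# text against those counts. More training data -> better predictions.
from collections import Counter

class TinyClassifier:
    def __init__(self):
        self.word_counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}

    def learn(self, text, label):
        """Adapt to a new labeled example - the 'feed it more data' step."""
        words = text.lower().split()
        self.word_counts[label].update(words)
        self.totals[label] += len(words)

    def predict(self, text):
        scores = {}
        for label in self.word_counts:
            total = self.totals[label]
            score = 1.0
            for w in text.lower().split():
                # add-one smoothing so unseen words don't zero the score
                score *= (self.word_counts[label][w] + 1) / (total + 1)
            scores[label] = score
        return max(scores, key=scores.get)

clf = TinyClassifier()
clf.learn("win a free prize now", "spam")
clf.learn("meeting agenda for tomorrow", "ham")
print(clf.predict("claim your free prize"))  # spam-like words dominate
```

Unlike the hand-written rules of the early auto-response era, nothing here was programmed to recognize "prize" as spam – the association emerged from the data, and adding more examples shifts the predictions without anyone editing a rule.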
How does this relate to customer service?
If you’ve ever used chat to communicate with a business seeking customer service, there’s a good chance you were chatting with a machine – a chatbot – and a good chance you didn’t even know it. The same goes for an email trail; you can’t be sure it was a human agent who typed that accurate and helpful response.
Imagine a customer service organization that could offer not only self-service in these digital channels, but also make those channels genuinely helpful without agent intervention. Chatbots are doing just that. They communicate in written form, where some latency is expected in a semi-synchronous medium, so there is no mechanical voice to give them away. Their responses are preprogrammed to communicate in a culturally relevant style. They can draw on knowledgebases containing vast amounts of data for solutions and answers. And lastly, they (or the AI knowledgebases behind the scenes) have the ability to learn and adapt: every interaction brings more data that the machine learning algorithms can exploit to formulate more accurate and helpful responses and solutions to customer service inquiries.
Does this mean that we no longer need agents? No. Nothing can replace human interaction in customer service, not yet at least. Any customer service organization would be wise to back these self-service engines up with human agents. In the same way that an agent sometimes needs to consult an expert, so too will machines need to consult humans – especially if that machine doesn’t have the data to feed its algorithm to compute a meaningful response.
Also consider machine learning both aided by and aiding human agents: machines learning from the solutions that human agents apply, and in turn offering predicted responses that agents can choose from when deciding how to respond to a given customer's issue.
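One simple way to picture this agent-assist loop is ranking past agent responses by word overlap with a new inquiry and surfacing the best matches as suggestions. This is a hedged sketch of the concept only; real systems use far richer models, and the history data below is invented for illustration.

```python
# Agent-assist sketch: score historical (inquiry, response) pairs by
# word overlap with the new inquiry, suggest the top matches.

HISTORY = [
    ("my order never arrived", "I'm sorry for the delay - let me track your order."),
    ("i was charged twice", "I see a duplicate charge; I'll issue a refund now."),
    ("how do i cancel my plan", "You can cancel from the Billing page at any time."),
]

def suggest(inquiry, top_n=2):
    """Return up to top_n candidate responses, best word-overlap first."""
    words = set(inquiry.lower().split())
    scored = []
    for past_inquiry, response in HISTORY:
        overlap = len(words & set(past_inquiry.lower().split()))
        scored.append((overlap, response))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [resp for score, resp in scored[:top_n] if score > 0]

print(suggest("why was i charged twice this month"))
```

The loop closes when the response the agent actually chooses (or writes) is appended to the history, so the machine learns from the humans it is assisting.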
Circling back to the IVR discussion: Remember those natural language IVRs that were so reviled? Well, those speech engines were driven primarily by a programmatic set of grammars. If a phrase wasn’t written into the grammar for the speech engine to match phonemes against, the engine wouldn’t recognize what was being said. This is why speech applications required long bouts of tuning (listening to utterances and expanding the grammars to accommodate them).
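The brittleness of grammar-driven recognition can be illustrated with a tiny model of the matching step (working on text rather than phonemes, purely for illustration; the grammar entries are hypothetical).

```python
# Why grammar-driven speech recognition was brittle: the engine could
# only match phrases explicitly listed in its grammar.

GRAMMAR = {
    "check my balance": "BALANCE_INQUIRY",
    "account balance": "BALANCE_INQUIRY",
    "pay my bill": "BILL_PAYMENT",
}

def recognize(utterance):
    """Map an utterance to an intent only if the grammar covers it."""
    phrase = utterance.lower().strip()
    for entry, intent in GRAMMAR.items():
        if entry in phrase:
            return intent
    return None  # out-of-grammar: "Sorry, I didn't get that."

print(recognize("I want to check my balance"))   # matches
print(recognize("how much money do I have"))     # fails until tuned
```

"How much money do I have" means the same thing as "check my balance," but it returns nothing until a human tuner listens to the failed utterance and adds that phrasing – exactly the tuning cycle described above, and exactly what a learning speech engine avoids.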
Machine learning can help here too. Driven first by voice assistants like Siri, and graduating into home assistants like Alexa, these applications learn to cope with the dynamic characteristics of human speech every day by listening and analyzing. IVR can exploit those now much smarter speech recognition engines as well – for what are the likes of Siri and Alexa but an IVR that you don’t have to dial a number to reach?
Larry Brown is head of product management for Telax.
Edited by Alicia Young