Q&A with IBM Watson's call center AI tech lead

We sit down with IBM's contact center AI lead to talk about chatbots and their evolving role as human customers and support agents learn to coexist with them.

Rob High, IBM Watson VP and CTO, finds himself having to convince the world of two seemingly contradictory ideas: that call center AI can automate many tasks now done by humans and that chatbot and conversational AI tech doesn't necessarily act in ways humans expect it to -- yet. We caught up with High to talk about the present and future of Watson in the contact center.

Where does Watson Assistant fit into call center AI tech stacks?


Rob High: Watson Assistant is a callable service with an API, and it can be hosted in the cloud. Its primary goal is to facilitate the creation of conversational agents.
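To make that "callable service" point concrete, here is a minimal sketch of invoking Watson Assistant from application code. It assumes the ibm-watson Python SDK's AssistantV2 interface; the API key, service URL and assistant ID are placeholders, not real values.

```python
# Minimal sketch: calling Watson Assistant as a service via the ibm-watson
# Python SDK (AssistantV2). Credentials, service URL and assistant ID are
# placeholders.
from ibm_watson import AssistantV2
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator('YOUR_API_KEY')
assistant = AssistantV2(version='2021-06-14', authenticator=authenticator)
assistant.set_service_url('https://api.us-south.assistant.watson.cloud.ibm.com')

# Each conversation runs inside a session, which lets the agent keep context
# across turns instead of treating every request as an isolated question.
session_id = assistant.create_session(
    assistant_id='YOUR_ASSISTANT_ID'
).get_result()['session_id']

response = assistant.message(
    assistant_id='YOUR_ASSISTANT_ID',
    session_id=session_id,
    input={'message_type': 'text', 'text': "What's my account balance?"},
).get_result()

for item in response['output'].get('generic', []):
    if item.get('response_type') == 'text':
        print(item['text'])
```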

I make a distinction between conversational agents and the term chatbots. Very generically, we might call [Watson Assistant] a chatbot. I've gotten a little sensitive to the term chatbot because so many of the chatbots in the marketplace today are really being used to provide a single-turn interaction: the client asks a question or issues a command, and the chatbot responds to it. That's useful, but not powerful.

If, for example, I ask, 'What's my account balance?' I may need to know my account balance, but that's not actually my problem. My problem is getting ready to buy something, or I'm trying to figure out how to save up for my kid's education. There's something deeper behind the question. A conversational agent [can get to] the question behind the question -- the real problem the client has. That, to me, is critical to engagement.
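As a rough illustration of that multi-turn distinction, the snippet below continues the session from the sketch above. The dialog itself is hypothetical; the point is that both turns share one session, so the agent can carry context from the first question into the second rather than answering each in isolation.

```python
# Illustrative multi-turn exchange (hypothetical dialog), reusing the
# `assistant` client and `session_id` from the previous sketch. A single-turn
# chatbot would answer each line in isolation; within a session, context
# accumulates so the agent can probe the question behind the question.
turns = [
    "What's my account balance?",
    "I'm trying to save for my kid's education -- what should I set aside each month?",
]

for text in turns:
    reply = assistant.message(
        assistant_id='YOUR_ASSISTANT_ID',
        session_id=session_id,
        input={'message_type': 'text', 'text': text},
    ).get_result()
    for item in reply['output'].get('generic', []):
        if item.get('response_type') == 'text':
            print(f'Agent: {item["text"]}')
```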

Contact center managers are beholden to real performance against real metrics. That and GDPR. There isn't always room for experimentation. How do they start with call center AI?

High: When you have a conversational agent and you place it in the middle of multiple channels -- mobile apps, web apps, Twitter, WhatsApp or any of these other messaging platforms, telephone IVR [interactive voice response] -- all those channels can be backed by the same conversational agent, which becomes sort of your first line of defense for client interaction.
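One illustrative way to picture that "same agent behind every channel" idea is a thin adapter that normalizes each channel's messages into plain text before handing them to a single conversational agent. The channel names and helper functions below are hypothetical, and Watson Assistant also offers prebuilt channel integrations, so hand-rolled routing like this is just a sketch of the architecture.

```python
# Illustrative sketch (hypothetical helpers, not an IBM integration API):
# every channel funnels into the same conversational agent, so intents,
# answers and escalation rules stay identical across web, messaging and IVR.

def ask_assistant(session_id: str, text: str) -> str:
    """Send one utterance to the shared agent (see the earlier AssistantV2 sketch)."""
    reply = assistant.message(
        assistant_id='YOUR_ASSISTANT_ID',
        session_id=session_id,
        input={'message_type': 'text', 'text': text},
    ).get_result()
    return ' '.join(item['text'] for item in reply['output'].get('generic', [])
                    if item.get('response_type') == 'text')

def handle_inbound(channel: str, session_id: str, payload: dict) -> str:
    # Normalize each channel's payload down to plain text before it reaches
    # the agent; the field names here are assumptions about the upstream systems.
    if channel == 'web':
        text = payload['message']
    elif channel == 'whatsapp':
        text = payload['body']
    elif channel == 'ivr':
        text = payload['speech_to_text']  # transcript from the IVR's speech recognition
    else:
        raise ValueError(f'unknown channel: {channel}')
    return ask_assistant(session_id, text)
```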


Some questions get asked a lot -- frequently asked questions. [Other questions] people don't ask quite as often; they're a little more complicated and fit into what we call the 'long tail.' If we can offload those frequently asked questions and have them resolved by the conversational agent, then a few things happen:

  • It dramatically reduces the time to engagement; there's no wait time, no hold time to get those questions answered.
  • Questions [are answered] consistently. You're not going to get a different answer depending on which call center or agent you end up with.
  • A [call center AI] conversational agent can move down into the long-tail questions people ask and, in many cases, can answer them without having to turn the conversation over to a live agent (see the sketch after this list).
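One common way to decide when a question has fallen too far down the long tail is a confidence threshold: answer automatically when the top detected intent is confident enough, escalate to a live agent otherwise. The sketch below reuses the AssistantV2 client from the earlier example; the 0.5 threshold and the handoff stub are illustrative choices, not part of the Watson Assistant API.

```python
# Hedged sketch: answer high-confidence questions automatically and hand the
# rest to a human. Threshold value and handoff behavior are assumptions.
CONFIDENCE_THRESHOLD = 0.5

def hand_off_to_live_agent(session_id: str, text: str) -> str:
    # Placeholder: in production this would enqueue the conversation for a human agent.
    return "Let me connect you with a specialist who can help with that."

def answer_or_escalate(session_id: str, text: str) -> str:
    reply = assistant.message(
        assistant_id='YOUR_ASSISTANT_ID',
        session_id=session_id,
        input={'message_type': 'text', 'text': text},
    ).get_result()

    intents = reply['output'].get('intents', [])
    if not intents or intents[0]['confidence'] < CONFIDENCE_THRESHOLD:
        return hand_off_to_live_agent(session_id, text)

    return ' '.join(item['text'] for item in reply['output'].get('generic', [])
                    if item.get('response_type') == 'text')
```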

At a recent trade show, about the nicest thing call center managers said to us about call center AI-powered chatbots was this: 'Sounds like they have a lot of potential … for somebody else's call center.' How do you get people like that to trust your technology and drive adoption?


High: If you can take the top 10 or 50 FAQs, you already have the answers. You can quickly test and validate for yourself what benefit you get from that. As you get comfortable with it and see its utility, you can always open that aperture -- to more of the customer base or to a broader set of questions -- and grow from there.
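A pilot along those lines can be very small. The sketch below is not an IBM API, just an illustration of the idea: seed the agent with a handful of FAQ answers you already have, log every interaction, and measure what share of questions was deflected from live agents before widening the rollout.

```python
# Illustrative pilot metric (hypothetical data shape, not an IBM API): start
# with FAQs whose answers are already known and track the deflection rate.
faq_answers = {
    'account_balance': 'Your balance is shown under Accounts > Summary.',
    'reset_password': 'Use the "Forgot password" link on the sign-in page.',
    'branch_hours': 'Most branches are open 9 a.m. to 5 p.m., Monday through Friday.',
}

def deflection_rate(interaction_log: list[dict]) -> float:
    """Share of questions the agent resolved without a live-agent handoff."""
    if not interaction_log:
        return 0.0
    resolved = sum(
        1 for i in interaction_log
        if i['matched_intent'] in faq_answers and not i['escalated']
    )
    return resolved / len(interaction_log)

# Example: 3 of 4 logged questions resolved by the agent -> 0.75 deflection rate.
log = [
    {'matched_intent': 'account_balance', 'escalated': False},
    {'matched_intent': 'reset_password', 'escalated': False},
    {'matched_intent': 'branch_hours', 'escalated': False},
    {'matched_intent': None, 'escalated': True},
]
print(deflection_rate(log))
```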

Many of our clients began that way. Royal Bank of Scotland is a really good example of this: today, they've grown [their AI implementation] to cover the vast majority of their product portfolio, both enrollment into those products and servicing them. They've gone as far as to establish a new role within their organization dedicated just to the experience the agent creates -- the chit-chat characteristics of the conversational agent -- so that it actually feels more socially engaged with you.

What is the potential for voice commerce, and where does IBM fit into all that?

High: We are embarking on a path where voice-activated interaction is really only the tip of the iceberg. There are plenty of examples where a voice-based conversational agent is appropriate. Alexa at home may be an example of that. In the car, we've already seen the benefits: it's better to be able to voice [commands] than to type them in and take your eyes off the road.

On the other hand, there are lots of social circumstances where people aren't comfortable with vocalizing their question, vocalizing their order or things like that in front of other people.

If you think back on the history of modern computing -- go back the last 70 years, to when [John] von Neumann brought modern computing into the world -- our interactions with machines have largely been defined by the constraints of the machine and have been mostly two-dimensional: keyboards and displays. In the early days, there were punch cards. Later, it was TTYs [teletypewriters] and green-screen terminals. Even now, our interaction with machines is mostly through entry fields, text boxes, radio buttons, sliders and things [like] that.

I watch my nephew interact with his tablet. It's almost like second nature to him. But it's still not natural. It's not sort of the same style of interaction that you and I would have talking to each other as humans.

And I think this is about to blossom: we're going to see forms of user-machine interfaces that are defined more by what is natural for humans -- a natural form of communication, a natural form of interaction. And that's not going to be simply a computer that passively listens to your voice, but rather one that more proactively creates a sense of presence with you.

Part of that might be voice-activated, but it will also include forms of visual response that come closer to what you and I as humans would recognize as body language, eye contact and facial expressions.

Editor's note: This interview has been edited for brevity and clarity.
