  • By: Grant Ronald
  • 09/27/2018

Speaking of the Future: AI Driven Conversational Interfaces

Artificial intelligence (AI), machine learning and chatbots have become quite the darlings of the Oracle event circuit. Every presentation seems to find at least one use case to demo AI-driven conversational interfaces through messaging channels. And why not? What could be cooler than mobile, cloud and AI addressing real-world business problems through an interface users are already comfortable with today? Here is a quick rundown on how to build chatbots.

This article first appeared in the bimonthly ORAWORLD e-magazine, an EOUC publication with exciting stories from the Oracle world, technological background articles and insights into other user groups worldwide.


The Building Blocks

All chatbot development platforms are essentially designed with the same building blocks. With Oracle Intelligent Bots, for example, we’ve packaged up these building blocks as a feature of Oracle Mobile Cloud Enterprise (OMCe) – making chatbots a first-class citizen in a multi-channel digital experience.

The building blocks are:



Intents

The use case that the bot understands as something it can actually do.

Training Utterances

Like a work apprentice, a bot doesn’t really know anything until you train it.


Entities

Variable elements or critical pieces of information in a conversation which need to be identified.

Natural Language Understanding

There must be a hundred different ways to ask to order pizza – the smarter the bot is at understanding language, the more successful your bot will be.

Dialog Flow

Like a river, conversations flow. This is where you define the bot conversation.

Backend Integration

Your banking or pizza bot can be as chatty as you like, but in the end it has to do something real in some backend system somewhere.

Intents and Training

Just like a real person doing a job, you need to tell the bot what use cases it is supposed to deal with and, by implication, everything else it should politely decline or pass to someone who can help. We call each of these use cases an intent. Furthermore, you have to supply training data that demonstrates how to differentiate the various use cases. These examples are called training utterances.
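At its simplest, this is just structured data: a set of intent names, each mapped to example phrases. The following minimal Python sketch illustrates the idea; the intent names and phrases are invented for the pizza example and are not Oracle's actual configuration format.

```python
# Hypothetical sketch: intents mapped to their training utterances.
# Names and phrases are illustrative only, not a real platform API.
intents = {
    "OrderPizza": [
        "I want a pizza",
        "Can I order a large pepperoni",
        "Get me a pizza for tonight",
    ],
    "OrderPasta": [
        "I'd like some pasta",
        "One spaghetti carbonara please",
    ],
    "ShowMenu": [
        "What's on the menu?",
        "Show me what you have",
        "Can I see the menu",
    ],
}
```

In a real platform you would enter these through the bot builder rather than code, but the shape of the data is the same: more and better utterances per intent means better classification.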

Figure 01: For the bot you define intents and training utterances

Using a pizza chatbot example, we define that our chatbot can deal with ordering pizza, pasta, or displaying a menu. For displaying the menu, we give some examples of typical phrases that would be used for requesting a menu.

And this is where super cool AI comes in. Given that we can’t guarantee how the user will actually request to browse the menu, we build up a model which is used to calculate a probability that any received user phrase can be classified as a particular intent.
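To make the idea of "calculating a probability per intent" concrete, here is a deliberately naive Python sketch: it scores each intent by word overlap with its training utterances and normalizes the scores into pseudo-probabilities. Real platforms train proper language models; this toy heuristic only illustrates the input/output shape of intent resolution.

```python
from collections import Counter

def classify(utterance, intents):
    """Toy intent resolver: score each intent by word overlap with its
    training utterances, then normalize scores into pseudo-probabilities.
    A real NLU engine uses a trained model, not this heuristic."""
    words = set(utterance.lower().split())
    scores = {}
    for intent, examples in intents.items():
        vocab = Counter(w for ex in examples for w in ex.lower().split())
        scores[intent] = sum(vocab[w] for w in words)
    total = sum(scores.values()) or 1  # avoid division by zero
    return {intent: score / total for intent, score in scores.items()}

# Illustrative training data for the pizza example.
intents = {
    "ShowMenu": ["show me the menu", "what is on the menu"],
    "OrderPizza": ["I want a pizza", "order a large pepperoni pizza"],
}

probs = classify("can I see the menu", intents)
best = max(probs, key=probs.get)  # the highest-probability intent
```

A real bot would also apply a confidence threshold: if no intent scores above it, the input is treated as unresolved rather than forced into the closest match.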

Now, most AI based on machine learning thrives on data: the more good-quality training utterances you give it, the better. But of course, the way to get more real-world training utterances is from real user input. This means you're likely to have to start off by synthesizing a small set of utterances to get up and running. One of the cool things Oracle Intelligent Bots offers is multiple training models: in particular, one model better suited to small sets of synthesized training utterances, and one which gives higher accuracy as you provide more and more data.

So you might find that your initial development and testing uses one model, but as you harvest more data you can switch to the model which gives you the full power of NLP (natural language processing), allowing the bot to better understand subtleties, slang, synonyms and the other challenges of natural language.

Finding the Details

However, understanding a user input and mapping it to an intent only gets you halfway there. In most cases an intent will have elements that are variable yet need to be specifically understood. We call these entities. “I want to order a large pepperoni pizza for 10 pm today” is obviously an order for a pizza; however, “pepperoni”, “large”, “today” and “10 pm” are variable elements of the input to which we must pay special attention, since they are unique (and important!) to each user’s request.

Entity extraction is the ability to define variable elements, such as date, time or numbers as well as domain-specific variables such as “pizza size” and “pizza topping”, which the chatbot can parse out of the sentence and assign into variables, typically ready to be passed to some backend system that needs to know the exact details of the pizza order.
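As a rough illustration of what entity extraction produces, here is a minimal Python sketch for the pizza example. The entity names ("size", "topping", "time") and value lists are assumptions made up for this article; a real platform resolves entities with trained models and built-in types for dates, times and numbers, not hand-written matching like this.

```python
import re

# Illustrative, hand-rolled entity extractor for the pizza example.
# Entity names and value lists are invented for this sketch.
SIZES = ("small", "medium", "large")
TOPPINGS = ("pepperoni", "margherita", "hawaiian")

def extract_entities(utterance):
    """Pull size, topping and time out of a pizza-order sentence."""
    text = utterance.lower()
    entities = {}
    for size in SIZES:
        if size in text:
            entities["size"] = size
    for topping in TOPPINGS:
        if topping in text:
            entities["topping"] = topping
    # Match times like "10 pm" or "10:30pm".
    m = re.search(r"\b(\d{1,2}(?::\d{2})?\s?(?:am|pm))\b", text)
    if m:
        entities["time"] = m.group(1)
    return entities

order = extract_entities("I want to order a large pepperoni pizza for 10 pm today")
```

The resulting dictionary is exactly the kind of structured payload a backend order service needs, as opposed to the free-form sentence the user typed.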

Figure 02: Entities help give relevance to each intent

Conversation Flow

A conversation flows: from the initial greeting through to ensuring the chatbot captures all the relevant information to perform a specific task. In Oracle Intelligent Bots this flow is defined using a simple markup language that implements the various states, or steps in a conversation. Each step in the conversation is implemented with a “component” that performs a simple and discrete task. For example, a text component offers up a welcome greeting followed by an intent component which captures the user’s input, then branches to the appropriate point in the conversation based on how that input is resolved to a specific intent (or whether it is unresolved). Each step guides the conversation until, typically, a special component, called a system component, calls a backend web service to actually perform the desired action against the backend system.
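The flow described above is essentially a state machine: named states, each handled by a component, with transitions chosen by the component's outcome. Oracle Intelligent Bots defines this declaratively in its own markup, but the mechanics can be sketched in a few lines of Python. All state names and handler functions below are invented for illustration.

```python
# Minimal state-machine sketch of a dialog flow. State names and
# handlers are illustrative, not Oracle's dialog flow markup.
def greet(ctx):
    print("Welcome! How can I help?")      # text component
    return "resolve_intent"

def resolve_intent(ctx):
    # An intent component would call the NLU model here;
    # we fake it with a keyword check.
    return "order_pizza" if "pizza" in ctx["input"].lower() else "unresolved"

def order_pizza(ctx):
    print("Placing your pizza order...")   # system component: backend call
    return None                            # end of conversation

def unresolved(ctx):
    print("Sorry, I didn't understand that.")
    return None

FLOW = {
    "greet": greet,
    "resolve_intent": resolve_intent,
    "order_pizza": order_pizza,
    "unresolved": unresolved,
}

def run(ctx, state="greet"):
    """Walk the flow from the start state until a handler returns None."""
    visited = []
    while state:
        visited.append(state)
        state = FLOW[state](ctx)
    return visited

path = run({"input": "I want a pizza"})
```

Each handler returning the name of the next state mirrors how each dialog-flow step declares its transitions, including the branch for unresolved input.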

Figure 03: The dialog flows from resolving the intent, checking if they’d ordered a pizza before and if so, displaying a list with Yes and No options

Is that It?

Well, yes and no. Those are the primary building blocks for developing a chatbot, and a real-world chatbot will essentially be a collection of intents and entities using AI to resolve input and drive the conversation through the defined dialog flow. However, the success of a chatbot, and indeed a chatbot platform, also rests on the tools and features that support those core capabilities. For example, the ability to deploy the chatbot not only through social channels such as Facebook, but also to embed it inside your company’s website, or maybe within a corporate mobile app.

Or the ability for the chatbot to seamlessly hand over to a call center agent should the conversation require the intervention of a human. Multi-language bots, rich media capture, structured data capture, location-based information and handling Q&A/FAQ styles of interaction are other features which help a bot meet the needs of its users.

Find Out More

So that is the taster. If you want to find out more about Oracle Intelligent Bots, you can check out our dedicated channel, our home page, or follow us @OracleMobile.

Are you interested in chatbots?

Then come to the DOAG 2018 Conference + Exhibition, where the topic will be discussed in numerous talks. Grant Ronald will be there as well with his talk “When intelligence isn’t enough: real-world chatbot development”. The conference for Oracle users will take place November 20 to 23 in Nuremberg.

Further information and registration