
What’s With All The Crappy Chatbots?

Underwhelmed by your chatbot? You’re not alone. Filament’s Rory McElearney – an actual human – offers some comforting words

The market is currently seeing a proliferation of poor chatbot implementations. Part of the problem is a lack of understanding of chatbot user experience (UX) and the rigidity of frameworks on which they are being built. Another part is our high expectations of this form of technology. So what’s the current state of the art? And will we all ever truly love our ubiquitous chatbot companions?

Conversational UX – rushing to a new (old) medium

It seems you can’t go a week without running into a new way to make chatbots. Whether it’s an open source code library for building the bot engine [1], big players such as Google’s api.ai or IBM Watson that manage conversation flows on top of their Natural Language Processing (NLP) services, or a fully-fledged drag-and-drop platform for ‘launching a bot in seven minutes’ [2], there can be little doubt that the massive appetite for bots is being served. But since their dramatic launch a few years ago, it does feel as though remarkably little progress has been made. The economic case for chatbots remains strong, as evidenced by the flow of money into the sector, but many customers have been decidedly underwhelmed by their first contact.

The chat interface is nothing new, since giving orders via the command line was the original way of communicating with a computer. In many ways, all we’ve done is come full circle. However, since those early days, we have all become accustomed to navigating by screen, mouse and touchpad. All the factors that made a graphic interface preferable for the majority of people haven’t suddenly gone away, so what has changed to fuel this rise in demand for chatbots?

Two things stand out. Firstly, users are choosing to spend more time in smartphone messenger apps at the expense of their desktop devices. Since it’s almost impossible to entice users to download a new app, the old adage stands that companies must go where their customers are – into the chat space.

Secondly, access to advanced machine learning algorithms has become increasingly straightforward. An Application Programming Interface (API) economy grants developers affordable access, one drip at a time, to a fortune of AI and machine learning IP, and they are using it to build machine learning into their products. The APIs powering the chatbot boom come from the field of Natural Language Processing (NLP) or Natural Language Understanding (NLU). These allow bots to take the words that you type and detect what you really mean from that natural language. They can even deduce your mood as you’re typing, or how you feel about specific topics, based on your choice of phrasing. This is an exciting field, with breakthroughs being made all the time. Even so, turning this potential into an engaging chatbot has proved to be a difficult task.
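To make the shape of this concrete, here is a minimal sketch of how a developer might call such a service over HTTP. The endpoint, request fields and response keys are hypothetical stand-ins rather than any particular vendor’s API.

    # A minimal sketch of consuming an NLU service over HTTP.
    # The endpoint URL, request fields and response keys below are
    # hypothetical placeholders, not any specific vendor's API.
    import requests

    def analyse(text: str) -> dict:
        resp = requests.post(
            "https://nlu.example.com/v1/analyse",  # hypothetical endpoint
            json={"text": text},
            timeout=5,
        )
        resp.raise_for_status()
        return resp.json()

    result = analyse("I still haven't received my order and I'm getting annoyed")
    print(result.get("intent"))     # e.g. "order_status"
    print(result.get("sentiment"))  # e.g. "negative"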

Realistic expectations?

It is possible to build chatbots whose responses are generated programmatically, but beyond simple chit-chat, anything remotely complicated will require the bot to make a call to a database or some other external source. If there’s a button on a website that requires a call to a database, a developer needs to write code for it. The same is true for bots: you can no more ask a bot to perform an unprogrammed action than you can ask an interface for a non-existent button. Chatbots present a unique challenge because, unlike the graphic interface, where the aim is to clearly display all options, part of the allure of chatbots is the illusion that they understand you. Spelling out what you can and can’t do risks breaking that spell.
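As a hypothetical illustration of that point, a bot can only route a detected intent to a handler somebody has already written and registered; an intent with no handler is the conversational equivalent of a button that doesn’t exist.

    # Hypothetical sketch: an intent only does something if a handler exists for it.
    def check_order_status(user_id: str) -> str:
        # In a real bot this would call a database or an order API.
        return "Your order is on its way."

    HANDLERS = {
        "order_status": check_order_status,
        # No handler is registered for, say, "change_delivery_address",
        # so asking for it is like asking a website for a button that isn't there.
    }

    def handle(intent: str, user_id: str) -> str:
        handler = HANDLERS.get(intent)
        if handler is None:
            return "Sorry, I can't help with that yet."
        return handler(user_id)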

The best bots are the ones that naturally draw users into asking them about what the bot knows best. Consider the perceived effectiveness of Amazon Echo’s Alexa versus Apple’s Siri. Most of the time, Alexa only has to deal with changing songs or reordering loo roll, yet Siri is expected to deal with all manner of requests. Creating a bot that maintains the illusion of understanding becomes more difficult as users ask more of it because the bot can be confused by requests across a wider range of topics.

And that’s when we human users are actually making our wishes very clear. Alexa’s success is, in part, due to her plugged-in-at-the-wall microphone’s ability to hear and understand more clearly than Siri can. Up until now, the instructions humans have given to computers have been pretty unambiguous; now, instead of interpreting a scroll and a click, machines also have to try to make sense of misspelled sarcasm.

Deciding which bucket a word or phrase should go into – which is what your bot must do in order to figure out what you’re talking about – is known in machine learning as a ‘classification’ problem. Deciding whose face is in a picture or making a film recommendation fall into the same category. The machine is making decisions based on training from humans, so if even a human would have a hard time differentiating between buckets, the machine has no chance!
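For readers who like to see the mechanics, here is a rough sketch of an intent classifier built with scikit-learn. The phrases and intent labels are invented for illustration, and this is not how any particular NLU vendor works under the hood.

    # A rough sketch of intent classification: sorting phrases into 'buckets'.
    # The example phrases and intent labels are invented for illustration.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    phrases = [
        "where is my parcel", "has my order shipped",   # order_status
        "I want my money back", "please refund me",     # refund
        "hello there", "hi, anyone about?",             # greeting
    ]
    labels = ["order_status", "order_status",
              "refund", "refund",
              "greeting", "greeting"]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(phrases, labels)

    # If two buckets overlap as much as they do for a phrase like this,
    # the classifier will struggle, just as a human reader might.
    print(model.predict(["hiya, my refund hasn't arrived"]))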

“It’s not you, it’s me”

Even if what you’ve said has been understood, a bot still has to deal with the fact that humans are fickle creatures. They might not really know what they want, and they often change their minds. On top of that, the chat interface challenges some of our most basic assumptions about design and UX by denying users the ability to see where they are or where they can go next, and by removing the ability to easily go back to where they have been.

We have all become experts at navigating the flat space of a screen. Our eyes instantly detect navbars and dropdowns as if detecting the walls of a room we’ve just entered. These familiar web design conventions give this space a multidimensional quality. We scroll down to read more, up to read it again, left and right to go backwards and forwards in time like we’re reading a book.

Yet having a conversation with a bot can be unsettling because these conventions are of no use. If you make a mistake and want to go back, it’s not always immediately apparent how to do so. This can generate a momentary panic and loss of trust that will have a massive impact on the quality of your experience.

Another limitation of the chat interface is that it denies the user the ability to quickly browse results. The first page of Google results is a dense, easy-to-scan block of information that would be far less accessible in a chat interface. Web pages ease the pain by presenting large amounts of information in a consistent layout. You know which part of the screen to look at, and that headings are set in a different text style from the content. You can scan quickly, in the knowledge that you don’t need to click on any result unless it catches your eye.

But in a chat console, there is no opportunity to lower the cost of scanning with clever design and layouts. Each line of text is as important as the next and the total number of words on screen at any time is limited by both the chat window and the user’s ability to digest them.

Building bots remains a fledgling art form

Despite all of these challenges, the forces driving the move towards chatbots are only going to grow stronger. Natural Language Processing will get better and cheaper, while lessons will be learned from the bots currently in production. Even users will help, by slowly becoming accustomed to dealing with chatbots.

The tools the chatbot developer can employ will get more sophisticated too, such as bots’ ability to track the context of a conversation in a way that mimics human short-term memory. A skilled bot architect can also draw upon under-the-hood machine learning assets such as trained classifiers and knowledge graphs, empowering the bot with the language of its domain, then bottling that ‘expertise’ into a set of entities and relationships.
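A minimal sketch of that short-term memory, assuming a simple slot-filling approach (the slot names and the expiry policy here are illustrative assumptions), might look like this:

    # Minimal sketch of conversational context that mimics short-term memory.
    # Slot names and the time-to-live policy are illustrative assumptions.
    import time

    class ConversationContext:
        def __init__(self, ttl_seconds: float = 120.0):
            self.ttl = ttl_seconds
            self.slots = {}        # e.g. {"topic": "weather", "city": "Leeds"}
            self.updated_at = {}

        def remember(self, slot: str, value: str) -> None:
            self.slots[slot] = value
            self.updated_at[slot] = time.time()

        def recall(self, slot: str):
            # Forget anything that hasn't been mentioned recently.
            if slot in self.slots and time.time() - self.updated_at[slot] < self.ttl:
                return self.slots[slot]
            return None

    ctx = ConversationContext()
    ctx.remember("topic", "weather")
    ctx.remember("city", "Leeds")
    # A follow-up turn like "and what about tomorrow?" can reuse the remembered city.
    print(ctx.recall("city"))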

One of the most important things a bot builder can do is to be aware of the early stage of this industry, both in terms of the tooling available and user acceptance. When deploying a bot, it’s unlikely to be perfect out of the box. Every bot will eventually be asked a question out of its scope because how a developer thinks users will behave and how they actually will are two completely different things!

There are two key design principles to follow. One, the conversation must be designed with fallbacks, so that it fails gracefully. Two, a developer will need supporting analytics and tooling in order to iterate on and retrain their bot based on what real users actually say.
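A minimal sketch of both principles together, assuming a simple confidence threshold (the 0.6 cut-off and the log format are illustrative assumptions): low-confidence turns get a graceful fallback reply and are logged so they can be reviewed and fed back into retraining.

    # Sketch of the two principles: fail gracefully, and log what was missed.
    # The confidence threshold and log format are illustrative assumptions.
    import json
    import logging

    logging.basicConfig(filename="unhandled_utterances.log", level=logging.INFO)

    FALLBACK = ("Sorry, I didn't quite follow that. "
                "I can help with orders, refunds and delivery times.")

    def dispatch(intent: str, utterance: str) -> str:
        # Hand off to the normal intent handlers (stubbed here).
        return f"(handling '{intent}')"

    def respond(utterance: str, intent: str, confidence: float) -> str:
        if intent is None or confidence < 0.6:
            # Record the miss so it can feed the next round of retraining.
            logging.info(json.dumps({"utterance": utterance,
                                     "intent": intent,
                                     "confidence": confidence}))
            return FALLBACK
        return dispatch(intent, utterance)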

Dating the Martian

Will we ever fall in love with chatbots? For many users, talking to bots is like dating a Martian and, at the moment, we’re all still on that awkward first date, with both sides talking past each other between plenty of uncomfortable silences. But as time goes on, we will adjust to each other’s foibles, learn each other’s language and, eventually, start having flowing conversations.

We would urge you all to stick with it because without question, the relationship between user and computing power is evolving. These first awkward bots and our stilted conversations with them are just the start of an ongoing relationship. Within the next decade, we will be absorbing computing power in whole new ways and augmented realities will touch every aspect of our lives.


Rory McElearney is a developer and chatbot UX expert at Filament. He has spent 18 months specialising in this emerging field, designing and deploying elegant chatbot solutions.
filament.uk.com
brainsteam.co.uk

