There is no question that conversational AI in the form of chatbots has made day-to-day life easier for many people. There is no need to sit through the frustrating automated telephone recording saying “press 1 for… press 2 for…” We can resolve many of our issues by visiting a website and using an online chat service.
Not only is the way humans interact with each other changing, but so is the way humans interact with chatbots. The World Summit AI saw a range of AI topics discussed, including what we are seeing now and what to expect in the future. What we are going to determine today is whether conversational AI is as state-of-the-art as we think.
Conversational AI as it is today
You will have noticed that chatbots have their limitations; Siri, amazing as she is, still doesn’t understand everything we ask her. This is mainly because of how broadly these assistants are used. Chatbots work best as a tool for a specific purpose. Right now, the ideal uses for chatbots include booking flights and hotels, changing reservations, ordering food, and similar tasks. Imagine each of these chatbots as an individual person you are communicating with for one purpose.
The language these bots can handle is something startups like Poly AI are trying to improve. We are still not able to use natural language when talking to chatbots; the next wave of chatbots will probably be more flexible about conversational input. This significant change won’t arrive just yet, but being able to speak naturally to a chatbot is high on the list of priorities.
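To see why natural phrasing still trips chatbots up, here is a minimal, hypothetical sketch of the keyword-style intent matching many of today’s task-based bots rely on (this is an illustration, not Poly AI’s or any vendor’s actual approach; the intents and keywords are made up). Anything phrased outside the expected patterns simply falls through to a fallback response.

```python
# Hypothetical sketch: rigid keyword-based intent matching.
# Intents, keywords, and example messages are illustrative only.

INTENTS = {
    "book_flight": ["book a flight", "flight to", "fly to"],
    "order_food": ["order food", "hungry", "pizza"],
    "change_reservation": ["change my reservation", "reschedule"],
}

def match_intent(message: str) -> str:
    """Return the first intent whose keyword appears in the message."""
    text = message.lower()
    for intent, keywords in INTENTS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "fallback"  # natural, unanticipated phrasing lands here

print(match_intent("I need to fly to Berlin on Friday"))            # book_flight
print(match_intent("Any chance I could get on an earlier plane?"))  # fallback
```

The second request means the same thing as the first, but because it doesn’t contain any of the expected keywords, a bot built this way has no idea what to do with it, which is exactly the gap the next wave of chatbots aims to close.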
What’s next for conversational AI?
Some startups committed to AI-driven business solutions are looking at how chatbots can be integrated into companies’ operations. More often than not, chatbots are highly successful at improving customer support and increasing sales. One especially exciting prospect is chatbots that can understand human emotion; if that becomes possible, the advances could be remarkable.
Many CEOs agree. Mikael de Costa, CEO and founder of Leadoo, is also enthusiastic about this possibility. It’s safe to say that humans don’t always do the best job of reading feelings. If chatbots could read emotions, or even put unspoken thoughts into words, companies would gain a far better understanding of what customers like and need.
Other startups are looking into ways of capturing human facial expressions and converting them into biometric data. This would make it possible to read micro-expressions, and even detect an elevated pulse, just by scanning a face. Brainworks.ai is in the process of creating an app that does this through a user’s smartphone.
And the next step?
For now, we will have to make the most of the chatbots already in place and the benefits they bring to task-based business problems. The ultimate chatbot is still a way off, but expect it to read your facial expressions and understand your language better than many humans. Chatbots of the future may know what you want before the words have left your mouth.