Nothing stands still in technology for long, and while many users are just getting to know the traditional chatbot interface, plans are afoot to evolve towards a more intuitive type of interaction. Get ready for wave two of chatbots; the future is certainly brighter for businesses and users alike!
Chatbots started out as a very mechanical process: some text appeared on the screen and the user responded. Whatever the technology behind the screen, it was a simple, universal way to communicate in your home language. Now many bots offer multilingual support, adding greater accessibility.
To speed things up, sometimes the user just had to tap one of two or three bubbles, options that saved them typing. Now we’re seeing bots that use emojis to drive the conversation, and bots that let people choose from a range of pictures or colours to help them pick a fashion item or home goods.
As the inevitable merging of virtual assistants and chatbots continues, we’re also seeing bots that talk and listen, using text-to-speech and speech recognition to remove the need for typing. The likes of Siri and Alexa can act as bots to help us order pizza, do the shopping and build lists, all without so much as a keypress.
Moving chatbots on to the next level
But that was expected, as one technology borrows from another. What is new is how the likes of Google plan to change how the typical chatbot screen looks and works. The new Google Assistant update provides many features that will change how bots work, and how people interact with a new generation of assistants.
For example, Google Assistant is bilingual, allowing it to understand two languages within the same conversation. Also good for getting it to play tracks by Rammstein (other uses are available). The service also works on smart displays (what we used to call tablets), so you can ask it to show videos, your smart home door camera feed and other visual content. Google isn’t just sticking to a text conversation. Instead, it will bring up imagery and video to help stimulate and engage.
Some of the features of the new app, coming to recent Android models now and iPhones soon, include:
- An interactive messaging interface so you can use your fingers to add a comma, change a word or make any other quick edits as you compose messages.
- On Android phones, it’s now easier to access an overview of your day. Open up the Assistant and swipe up on your screen to get curated information based on the time of day and your recent interactions with the Assistant.
- Developers and brands now have tools to take full advantage of the phone screen. Starbucks now offers thumbnails of recommended items on its menu to select from, Food Network has larger images of its recipes, and FitStar uses GIFs to give you a preview of your workout.
These changes should make chatbots more useful and engaging, and will likely see micro-bot engagements take over as we fettle and fine-tune our calendars across a range of services.
Google isn’t ditching the chatbot
There’s much press chat about Google doing away with chatbots, but the reality is that Google continues to buy companies to gain better technology for building the next generation of products. This week’s acquisition of Onward, a startup behind an AI-powered customer service bot and the Agent Q shopping bot, shows Google is far from done with the technology.
Where Google goes, others will follow, and the likes of Apple and Amazon will be hard at work trying to push their own visions of the chatbot/virtual assistant further and make them more useful. What these final visions will look like is anyone’s guess as each company hunts for the next killer app, but expect other bot builders to expand the feature sets and capabilities of their bots to keep up with the pace.
SnatchBot started out as a scripted bot, but has since added natural language processing and voice features, and has linked to an ever-growing number of social media and chat platforms around the world to keep the service relevant to businesses and users.