Chatbots - The New Voice Of Your Brand
ELIZA, developed in 1966 at MIT, is widely regarded as the first chatbot. The very act of giving her a human name sent a clear message about the expectation that chatbots be as human as possible. But while ELIZA could only communicate through a pre-set collection of pattern-matched responses, today’s chatbots have come a long way in both complexity and personality. And while their impact on customer service has been substantial, it has been accompanied by a host of questions about ethics and transparency.
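ELIZA’s pre-set-response approach can be illustrated with a toy sketch: each rule maps a pattern to canned reply templates, echoing part of the user’s message back. The patterns below are purely illustrative, not Weizenbaum’s actual script.

```python
import re
import random

# Illustrative ELIZA-style rules: a regular expression paired with reply
# templates that echo the matched fragment back to the user.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
]
# Fallbacks when no pattern matches.
DEFAULTS = ["Please tell me more.", "How does that make you feel?"]

def respond(message: str) -> str:
    for pattern, templates in RULES:
        match = pattern.search(message)
        if match:
            return random.choice(templates).format(match.group(1).rstrip(".!?"))
    return random.choice(DEFAULTS)

print(respond("I need a holiday"))
```

The bot has no understanding of the conversation at all; it simply reflects the user’s own words back, which is exactly the limitation modern chatbots have moved beyond.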
The need for chatbots
With companies seeing an exponential increase in customer requests across multiple channels, it has become almost impossible to meet expectations around responsiveness, in terms of both speed and quality. Recent advances in artificial intelligence and language recognition have reduced the need for human agents staffed around the clock. With more companies seeing the benefits of chatbots, it’s not surprising that over half of enterprise companies are predicted to spend more on chatbots than on mobile app development by 2021. It’s also interesting to note that Australian consumers are embracing and leading this evolution, with amongst the highest rates of chatbot use worldwide.
After Turing, what’s next?
A recent project by Foreign Object, a design and research studio formed by MIT students, puts into sharp perspective one of the challenges we’re now facing with chatbots: the notion that machines could soon pass themselves off as human (and potentially even pass the Turing test). “Bot or not” is a short conversation-based game in which you chat with either another human or a bot, and have to guess which it was at the end of the conversation. If you think you’re up for the challenge…
Truth or dare
Conversational bots’ disruption of traditional customer service has raised an important ethical question for brands, potentially requiring them to choose between transparency and profit. A recent research paper suggested that undisclosed chatbots are as effective as proficient workers, and four times more effective than inexperienced workers, at converting customer purchases. More interestingly, the researchers discovered that disclosing the use of a chatbot reduced purchase rates by roughly 80%. Why? Users who knew they were interacting with chatbots found them less knowledgeable and empathetic; past experiences had left respondents with preconceived negative perceptions, irrespective of the bot itself. The researchers concluded by recommending that organisations disclose the use of chatbots at the end of a conversation, in order to reap the maximum benefits of duplicity whilst still remaining compliant with any existing “bot laws” in their state or territory. As the makers of Bot or not put it: “people make emotional connections and give one another the benefit of the doubt, even if it's just a stranger over the phone – which is precisely the emotional response that [Google] Duplex and others seek to exploit.”
Connecting with a machine
When evaluating brands, we already know our emotions come first. Building an emotional connection between the brand and its customers, whether over the phone or through text, is crucial. With chatbots increasingly becoming a first point of contact, brands will need to work harder to keep abreast of the newest technologies and put forward an experience they are proud of, in order to combat pre-existing negative perceptions and provide a seamless experience.
That doesn’t just mean understanding the subtleties of language, or detecting sentiment and adjusting accordingly. It also means delivering a more personalised experience based on someone’s history: anticipating requests, even picking up a conversation where it was left off, will all become part of the new chatbot normal.
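Picking up a conversation where it left off can be sketched very simply: keep a session per customer and have the greeting consult it. This is a minimal illustration with a hypothetical in-memory store; names like `greet` and `open_topic` are assumptions, not from any particular chatbot framework.

```python
# Hypothetical in-memory session store, keyed by customer ID. A real
# deployment would persist this across channels and devices.
sessions: dict = {}

def handle(customer_id: str, message: str, topic: str) -> None:
    """Record the message and remember the open topic for next time."""
    session = sessions.setdefault(customer_id, {"history": [], "open_topic": ""})
    session["history"].append(message)
    session["open_topic"] = topic

def greet(customer_id: str) -> str:
    """Greet a returning customer by resuming the open topic, if any."""
    session = sessions.get(customer_id)
    if session and session["open_topic"]:
        return (f"Welcome back! Last time we were discussing "
                f"{session['open_topic']}. Shall we pick up where we left off?")
    return "Hi! How can I help you today?"

print(greet("cust-42"))  # first contact: generic greeting
handle("cust-42", "Where is my order?", "a delayed order")
print(greet("cust-42"))  # next contact: resumes the open topic
```

The design point is that continuity lives in the stored session, not in the bot’s language model, which is why even a simple bot can feel more personal once it remembers its customers.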
And while this is all very exciting and futuristic, chatbots are not set-and-forget solutions and, in reality, are still a long way from where they need to be. But at the end of the day, chatbots are brand ambassadors and should be treated as such.