What do people think of engaging with chatbots when visiting a company’s website? For the most part, people do not like being duped into thinking they are interacting with a real person. But when the chatbot makes a mistake or is unable to resolve a problem, people are actually relieved to find out they were not speaking with a real person, according to a study published recently in the Journal of Service Management.
“A chatbot is more likely to be forgiven for making a mistake than a human,” said Nika Mozafari, a research assistant in marketing and innovation at the University of Göttingen and the lead author of the study, in a published report. The customer blames neither the company nor the bot, and may even react positively.
But when it comes to conversations deemed “important or critical,” such as financial issues, people reacted negatively after learning they had been interacting with a chatbot; that disclosure weakens trust, the researchers found. When the bot is unable to work with the person to find a solution, however, disclosure leads people to be more forgiving, which can actually improve loyalty.
To test their theory, the researchers had 200 participants use an online chat session with their energy provider to try to update the address on their account following a move. Half of the participants were told they were interacting with a chatbot; the other half were not. The researchers examined the impact of making the disclosure, as well as whether the chatbot was able to resolve the request.
There is a school of thought in the accounts receivable management industry that individuals with debts are embarrassed to discuss their financial situations with other people and are more comfortable using text messaging, email, or online chats to negotiate and work out payment arrangements. As more companies consider using chatbots or other forms of artificial intelligence or interactive voice response tools to automate more of the collections process, knowing whether to tell an individual that he or she is interacting with a machine rather than a human may help or hurt that negotiation. It is important for companies to be transparent, but it is also important to build processes that collect the most money. Managing that balancing act will become increasingly tricky as more of the collections process is automated.