CFPB Says Chatbots Must Comply With Consumer Protection Laws
Bureau issue spotlight examines role of chatbots in shift from relationship- to algorithmic banking.
Chatbots can be a real pain.
Used correctly, they can save credit unions and banks valuable resources.
Used incorrectly, they can cause consumers to tear their hair out — and may violate consumer protection laws, the CFPB said this week.
“Chatbots, which simulate human-like responses using computer programming, were introduced in large part to reduce the costs of human customer service agents,” the CFPB said in an issue spotlight. “Recently, financial institutions have begun experimenting with generative machine learning and other underlying technologies such as neural networks and natural language processing to automatically create chat responses using text and voices.”
However, deficient chatbots that prevent access to live, human support can lead to violations of the law, diminished service and other harms, according to the bureau.
Backstory and an Agency Warning
“To reduce costs, many financial institutions are integrating artificial intelligence technologies to steer people toward chatbots,” CFPB Director Rohit Chopra said. “A poorly deployed chatbot can lead to customer frustration, reduced trust, and even violations of the law.”
And the agency issued a stiff warning, stating, “The shift away from relationship banking and toward algorithmic banking will have a number of long-term implications that the CFPB will continue to monitor closely.”
In the issue spotlight, the CFPB said it has received numerous complaints from frustrated consumers trying to get timely, straightforward answers from their financial institutions.
“Working with customers to resolve a problem or answer a question is an essential function for financial institutions—and is the basis of relationship banking,” the agency said.
The bureau said that like the processes they replace, chatbots must comply with all applicable consumer finance laws.
Additionally, the agency said that chatbots can introduce a level of inflexibility, whereby only specific words may trigger a response.
“Automated responses by a chatbot may fail to resolve a customer’s issue and instead lead them in continuous loops of repetitive, unhelpful jargon or legalese without an offramp to a human customer service representative,” the bureau said.
To make matters worse, consumers may find those “offramps” to human customer support blocked by unreasonable wait times.
“These obstacles not only leave consumers stuck and without help but seriously impact their ability to manage their finances,” the bureau stated.