
How Generative AI can become a trusted money adviser 

Have you ever wished that your bank could offer you personalised financial advice, just like a financial adviser? And that this advice could be updated in real time in response to changing personal and even wider economic circumstances?

What if I told you that this is not a crazy idea and that, with the rapid evolution of generative artificial intelligence, it’s within the reach of any bank – provided it learns how to use it?

In the banking sector, applying AI to customer service isn’t new. Many banks and institutions already use chatbots to deal with simple customer queries, but these are powered by Natural Language Processing (NLP) software that answers questions by selecting from a set of fixed responses. They are very different to the type of generative AI that, with the advent of ChatGPT, will soon become commonplace.

This more advanced generative AI can offer the next level of customer interaction, including interpreting human emotions, with the potential to establish enough trust between customer and machine to become an asset, almost an adviser, as it helps customers manage their money.

A unique customer experience 

Imagine if that customer chatbot, usually sitting in the bottom right-hand corner of the screen, could help you figure out how long it will take to save a deposit for a house.  

No longer simply regurgitating pre-filled answers to FAQs, this new generative AI bot would help you plan how to save money based on a projected timeline and your income, while also making suggestions — in real time — on where to cut spending based on your habits. 
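To make that concrete, here is a minimal sketch of the kind of calculation such a bot might run behind the scenes, written in Python. The figures, spending categories and the months_to_deposit function are invented for illustration; they are not any bank’s real system or API.

# Illustrative sketch: estimating how long it could take to save a house deposit.
# All figures and category names below are hypothetical assumptions.

def months_to_deposit(deposit_goal, monthly_income, monthly_spending, suggested_cuts):
    """Estimate the months needed to reach a deposit goal.

    suggested_cuts maps a spending category to the amount the bot suggests trimming.
    """
    trimmed_spending = monthly_spending - sum(suggested_cuts.values())
    monthly_savings = monthly_income - trimmed_spending
    if monthly_savings <= 0:
        return None  # goal unreachable at current income and spending
    # Ceiling division: a partial month still counts as a month of saving.
    return -(-deposit_goal // monthly_savings)

# Hypothetical customer profile
cuts = {"takeaway food": 150, "subscriptions": 40}
months = months_to_deposit(
    deposit_goal=60_000,
    monthly_income=5_200,
    monthly_spending=4_300,
    suggested_cuts=cuts,
)
print(f"Estimated time to deposit: {months} months")  # 56 months with the cuts applied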

This sort of interaction is worlds apart from the way most banks today relate to their customers. To date, digital banking has stripped away the kind of personalised service that once existed, where an experienced bank manager would sit with a client, look over their accounts and provide tailored financial advice.

Banks can also layer on the capabilities of more sophisticated AI, which can cope with a variety of unstructured responses, both written and spoken. The data derived from these interactions gives the AI the potential to pick up on idiosyncratic phrasing and changes in tone to detect how a customer is feeling.

Depending on the level of sophistication, the AI could then either tackle a problem itself or switch over to a human operator. If the conversation turned to fear or financial anxiety, for example, the AI could step in to direct the customer to specific support services designed to prevent debt cycles.
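As a rough illustration of that routing decision, the sketch below uses a simple keyword score as a stand-in for a real tone-detection model. The keywords, thresholds and destination names are assumptions made for the example, not a description of any production system.

# Illustrative sketch: routing a customer message based on detected distress.
# A production system would use a trained sentiment or emotion model; the
# keyword score below is only a stand-in, and every threshold is an assumption.

DISTRESS_TERMS = {"worried", "scared", "can't pay", "overdue", "debt", "anxious"}

def distress_score(message: str) -> float:
    hits = sum(term in message.lower() for term in DISTRESS_TERMS)
    return hits / len(DISTRESS_TERMS)

def route(message: str) -> str:
    score = distress_score(message)
    if score >= 0.3:
        return "human_operator"          # clear signs of anxiety: hand over to a person
    if score > 0:
        return "financial_support_info"  # gentle nudge towards hardship support services
    return "ai_self_service"             # routine query the bot can handle itself

print(route("I'm worried I can't pay my card this month and the debt keeps growing"))
# -> human_operator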

The AI can then “listen in” on these human interactions and learn from them, because a genuine AI system is one that learns from the data it is fed in order to carry out tasks that typically require human intelligence.

Ethical and privacy considerations 

Yet the above scenarios hinge on safeguarding the security and privacy of customer data. Collecting and using sensitive information carries the risk of unauthorised access, or of disclosing information about the wrong person.

It will also be incumbent on banks and other lending institutions to ensure the data being fed into the AI – and the AI’s interpretation of that data – is de-identified, and that its algorithms do not have in-built bias.
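A minimal sketch of what de-identification can look like in practice follows, assuming a simple pattern-based redaction step before a transcript reaches the model. The patterns are illustrative only and far from exhaustive; real pipelines would use dedicated PII-detection tooling.

import re

# Illustrative sketch: stripping obvious identifiers from a chat transcript
# before it is fed to an AI system. The patterns are assumptions for the
# example and nowhere near exhaustive.
REDACTIONS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
    (re.compile(r"\b\d(?:[ -]?\d){7,15}\b"), "[ACCOUNT_OR_CARD]"),
]

def de_identify(text: str) -> str:
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

sample = "Hi, my card 4111 1111 1111 1111 was declined, email me at jo@example.com"
print(de_identify(sample))
# -> "Hi, my card [ACCOUNT_OR_CARD] was declined, email me at [EMAIL]"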

It may seem trite, but unconscious bias can really derail people’s lives. I know of one instance where a major bank had issues with its employee recommendations because of the way “male” and “female” employees were coded: the “male” option was presented first, and because nobody thought to give both categories an equal value of 1, the algorithm did not treat them as equal.
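The sketch below illustrates one reading of that coding problem, assuming the categories were encoded by their position in a list rather than each being given its own indicator value of 1. The data and field names are invented for illustration.

# Illustrative sketch of the encoding pitfall described above. The data is
# invented; the point is how the representation changes what a model "sees".

employees = [{"name": "A", "gender": "male"}, {"name": "B", "gender": "female"}]

# Problematic: encoding by position in a list. "male" was listed first, so it
# silently becomes 0 and "female" becomes 1, giving the model an implicit
# ordering and magnitude that has no business meaning.
order = ["male", "female"]
positional = [{**e, "gender_code": order.index(e["gender"])} for e in employees]

# Better: one-hot encoding, where each category gets its own column and every
# employee carries a value of 1 in exactly one of them, so the categories are equal.
one_hot = [
    {**e, "is_male": int(e["gender"] == "male"), "is_female": int(e["gender"] == "female")}
    for e in employees
]
print(one_hot)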

In the interim, there will need to be human oversight to address this. In the longer term, there’s probably going to be a tipping point where we have enough experience to assess the potential levels of fraud, and weigh that against the profit opportunity of this kind of personalisation and efficiency. 

Much as many financial institutions reimburse customers for fraudulent online transactions, the initial growing pains of embracing an advanced AI technology may come at a cost before it proves profitable in the long term.

The opportunity the industry will be eyeing is the next level of customer service, which will see AI provide personalised budget advice based on a deep understanding of people’s spending behaviours.

Forging these trusted relationships via AI will also enable banks to build up a more personalised view of their customers – so when staff do speak to them about more complex requests, they can offer tailored advice from the outset. That might make employees happier too, because they will be able to deliver greater value to customers.

While we’re still quite a way from that model of customer service and experience, it’s on the horizon. Banks – especially the Big Four – have so much data at their fingertips; it’s simply a matter of finding secure ways to feed that information into AI applications and seeing how far it can take us.


James Noble

James Noble is the Chief Experience and Design Officer at Wongdoody.
