CBA reveals promising regtech AI results, proceeds with caution

London lab researching regtech solutions in the face of rising compliance costs

The Commonwealth Bank of Australia (CBA) has revealed the results of a regtech pilot that uses natural language processing and artificial intelligence to convert regulatory texts into compliance obligations.

Speaking on Thursday at the AI NSW Summit in Sydney, the head of CBA's London Innovation Lab, Supun King-Jayawardana, said the technology was able to crunch regulation documents into actionable compliance with a "genuinely surprising" 95 per cent accuracy.

The experiment – carried out in partnership with Dutch bank ING and overseen by the UK's Financial Conduct Authority (FCA) – applied AI technology from Chicago fintech firm Ascent Technologies to the 1.5 million paragraphs in a piece of banking regulation: Markets in Financial Instruments Directive II (MiFID II).

"All this regulation comes out as text, and that gets sent to the industry. An individual human being, a compliance officer, reads through that, they may go and consult with internal legal and external legal, and then they would say, okay let's create an obligations register. These are the particular obligations that us as an organisation or particular entity needs to be aware of," King-Jayawardana explained.

"That whole process is incredibly manual, and for the example we were looking at it took six months for a specific piece of regulation in the UK to do."

Ascent's technology provides a '200-step AI pipeline' that ingests the text and applies domain-specific artificial intelligence.

"These guys [at Ascent] are ex-regulators and they obviously have a deep technical expertise. They say, well we can understand how regulators write regulations based on our experience of going through this. And they train models to understand the semantic profile of regulations to create an automatic obligations register," King-Jayawardana said.
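Ascent's actual pipeline is proprietary and the article gives no implementation detail, but the core idea it describes — scanning regulatory text and flagging paragraphs whose language imposes a duty — can be sketched in miniature. The marker phrases and sample sentences below are illustrative assumptions, not anything from Ascent's system:

```python
import re

# Deontic phrases that typically signal a legal obligation (illustrative list).
OBLIGATION_MARKERS = re.compile(
    r"\b(shall|must|is required to|are required to)\b", re.IGNORECASE
)

def extract_obligations(paragraphs):
    """Return the paragraphs that appear to impose an obligation."""
    register = []
    for i, text in enumerate(paragraphs):
        if OBLIGATION_MARKERS.search(text):
            register.append({"paragraph": i, "text": text})
    return register

# Toy stand-ins for regulatory paragraphs (invented for this sketch).
sample = [
    "Investment firms shall keep records of all services and transactions.",
    "This Directive applies from 3 January 2018.",
    "Member States must ensure that competent authorities monitor compliance.",
]

print(extract_obligations(sample))  # flags paragraphs 0 and 2
```

A real system would of course need far more than keyword spotting — the point of Ascent's "semantic profile" models is precisely that obligations are often implied by context rather than signalled by a modal verb.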

What took CBA's compliance team six months was completed by Ascent within two weeks. The resulting obligations register was scrutinised by legal firm Pinsent Masons and found to be 95 per cent accurate.

The positive result raised a number of questions, King-Jayawardana said.

"Is that good enough? Because you want to be 100 per cent accurate on this stuff; it's a regulated industry. The answer could be – well then how do you involve external legal to say let's look through this pre-defined list and add anything that's missing. Does that reduce the legal cost so it's still a win potentially?" he asked.

"Or does it mean we somehow tweak the algorithms to a point where the regulator is comfortable? And we don't know what that threshold is. For this I would say 100 per cent. 'Well we missed a few obligations' is not what you want," he added.

Regulator's role

The experiment also raised questions for regulators to consider. In a particular section of the regulation, the AI returned an accuracy score of only 65 per cent. "Not great, not great at all," King-Jayawardana said.

Further investigation found a likely reason – a specific line in the regulation referred to a 'member state of the EU'. 

"As a compliance officer you would understand the context, we know that though it doesn't necessarily say it, it means a financial institution. We're still a member of the EU so this will fall on us – it's not going to fall on the UK government," King-Jayawardana said. "But an algorithm, it says well CBA is not technically a member state of the EU."

In that case, arguably the regulation is "a bit vague", King-Jayawardana said. "If the [regulator's] end vision is to create machine readable regulation, let's make sure we don't write it like that, because a human being can spot that but an algorithm can't."
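The failure mode King-Jayawardana describes can be made concrete with a toy sketch. An algorithm matching the literal addressee of a clause drops obligations addressed to a 'member state of the EU', while a compliance officer applies context — such obligations are transposed into national law and ultimately fall on financial institutions. Everything below (function names, the transposition table) is a hypothetical illustration:

```python
# Toy applicability check: decide whether a clause falls on an entity
# by literally matching the addressee named in the regulation's text.
def applies_to(clause_addressee: str, entity_type: str) -> bool:
    return clause_addressee == entity_type

clause = "member state of the EU"  # literal addressee in the text
print(applies_to(clause, "bank"))  # False: the algorithm drops the obligation

# A compliance officer brings context the text leaves implicit: obligations
# addressed to member states flow down to regulated firms (illustrative map).
TRANSPOSES_TO = {"member state of the EU": {"bank", "investment firm"}}

def applies_with_context(clause_addressee: str, entity_type: str) -> bool:
    return applies_to(clause_addressee, entity_type) or \
        entity_type in TRANSPOSES_TO.get(clause_addressee, set())

print(applies_with_context(clause, "bank"))  # True: context recovers it
```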

The Australian government is already backing research into improving the readability of regulation, with CSIRO's Data61 working on pilots to present text-based rules as APIs.

Last year the Australian Securities and Investments Commission (ASIC) established an innovation hub to improve engagement with the regtech industry, an area it says has "enormous potential". In February the commission issued tenders for pilots that apply natural language processing to 'regulatory problems'.

The UK's FCA is currently seeking industry input on 'machine executable regulation' and how "technology could make it easier for firms to meet their regulatory reporting requirements and improve the quality of information they provide".

"By working together, we can share best practice in due diligence, experimentation costs, business knowledge and resources – ultimately driving great outcomes for all parties," King-Jayawardana said.

The cost of compliance for Australian businesses is high and rising. Deloitte Access Economics estimates that federal, state and local government rules and regulations cost $27 billion a year to administer, and $67 billion a year to comply with. According to KPMG, the big Australian banks spend between $350 million and $450 million on regulation and compliance each year. The ongoing Royal Commission into banking could recommend further regulation of the finance industry.

There's also the significant cost of non-compliance. CBA in February earmarked $375 million to pay for penalties arising from legal action launched against it by AUSTRAC, Australia’s anti-money-laundering watchdog.

Big steps

CBA established its London innovation lab in 2017. King-Jayawardana shared the details of two other regtech pilots that are underway. The first involved startup Bloomsbury AI, using its technology to create chatbots from internal rules text. The work resulted in a conversational interface to answer questions around the company's gifts and entertainment policy.

Another pilot, done in collaboration with an unnamed startup, used AI to analyse unstructured text from regulators, such as speeches and guidance notes, to try to determine future regulatory trends.

"It sounds good in theory but that was a bit of a failed experiment for us. We thought, if it's unstructured guidance notes, can you understand trends? But there's not enough training data. It's difficult to know this set of material had this impact. It's difficult to train that; while it all looked good in theory, it just didn't work," King-Jayawardana said.

Despite some promising initial results, CBA has a "proceed with caution" approach to regtech.

"The reality is I don't think in certain critical areas we would ever go towards no human intervention or oversight, and the same could be said about no external legal oversight," King-Jayawardana said.

"For all organisations, to go from 'we work with a law firm that we know about' to 'now we work with a technology solution that we've run a few experiments on and are fairly comfortable is accurate' – is a big step."