NAB's chief architect on multi-cloud, data science automation

Sergei Komarov looks beyond daily delivery deadlines to building the bank of the future

Change is the only constant in the high-tech world, and technologies now in their infancy will change the way businesses operate and interact with their customers. Here are some of the developments we’ll be hearing more, and less, of this year.

As chief architect for NAB, I have the privilege to work on building the bank of the future. The bank is investing billions in an ambitious, multi-year digital transformation initiative that will improve our customer experience and allow us to launch new services quickly, economically and safely.

Since joining the technology team last year, I’ve been immersed in a multitude of projects that all contribute to achieving this aim. The exercise has been a bit like finding ways to assemble a multi-dimensional puzzle: great fun for a techie wanting to get into the nitty-gritty of many complex, challenging and cutting-edge pieces for which there’s no pre-set roadmap.

The other great part of the job is to look a bit farther out than the daily drumbeat of delivery deadlines. ‘Switching on the high beams’ to see what twists and turns in the road ahead we may need to lean into is fascinating and fun. Here’s my take on the tech trends du jour.

Advanced multi-cloud

Cloud is now almost universally regarded as the most robust and resilient form of infrastructure for large enterprises running multiple mission-critical applications. The next frontier is multi-cloud: not simply distributing workloads across more than one cloud provider, but enabling greater portability so workloads are not irreversibly pinned to a single provider. 2019 will see more large organisations looking to tackle this, and more open-source projects and open-core companies emerging to address the need.

At NAB we are implementing a pragmatic but ambitious multi-cloud strategy with a view to maximising portability without descending to the lowest common denominator; for more on this point, see my previous blog post on multi-cloud. As understanding of these complexities matures, I expect to see other companies follow suit.
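To make the portability point concrete, here’s a minimal sketch of the idea: business logic codes against a provider-neutral interface, and each cloud gets its own adapter. The `ObjectStore` interface and the in-memory stand-in below are hypothetical illustrations, not NAB’s actual implementation.

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Provider-neutral contract the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in for a real provider adapter (S3, GCS, Azure Blob, ...)."""

    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def archive_statement(store: ObjectStore, customer_id: str, pdf: bytes) -> None:
    # The workload depends only on the interface, so moving clouds
    # means swapping the adapter, not rewriting the business logic.
    store.put(f"statements/{customer_id}.pdf", pdf)
```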

Data science automation

Organisations’ collective appetite for the insights contained within their vast repositories of data has grown exponentially in recent years. Commentators have dubbed data ‘the new oil’ and in 2012 Harvard Business Review declared data science the sexiest job of the 21st century. I can only concur. Trouble is, our education systems just don’t produce enough mathematics or statistics majors to satisfy the demand for hard-core data science.

I’m expecting a rise in data science automation: the development of AI-driven tools that can take care of more of the preparation work, like data cleansing, feature generation and automation of model runs, leaving data scientists to concentrate on the higher value-added activities. The ability to automate the generation and testing of models, tune approximations for explainability, apply generic algorithms to features and models, and deal with model ensembles would remove the need for a data scientist to spend days and weeks on basic tasks just to achieve baseline model performance. Instead, they can start from that baseline and apply their efforts to refinement, where the insight, experience and judgement of an expert yield the greatest benefit.
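To illustrate the baseline idea, here’s a toy sketch using scikit-learn: cross-validate a handful of candidate models unattended and report the best starting point. A real automation platform would also handle cleansing and feature generation; this shows only the automated-model-runs step.

```python
# Toy illustration of automated model runs (not a production AutoML
# system): evaluate several candidates so the data scientist starts
# from "best of the obvious" rather than from zero.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=200),
    "gradient_boosting": GradientBoostingClassifier(),
}

# Cross-validate every candidate unattended and record mean accuracy.
baseline = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}

for name, score in sorted(baseline.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```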

Fast data

Big data has long been a buzzword, but ‘fast data’ is rapidly gaining mindshare. In today’s connected world, trends emerge, amplify and fade much faster than they did in the past. Customers expect every interaction with a company to be informed by the data that company already holds, without having to re-state what should already be known.

Instead of batch analytics-driven hindsight, just-in-time insight and foresight are where the premium increasingly lies. That means the traditional big data approach of amassing a mountain of data and analysing it periodically is less useful than being able to glean insights from the information that’s streaming in, in the here and now. I expect to see much greater adoption of tools enabling organisations to do this at scale, first augmenting then gradually supplanting some of the traditional analytics.
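As a minimal illustration of the streaming mindset, here’s a sketch in plain Python that derives a rolling insight the moment each event arrives. The stream is simulated; in practice events would come from a platform such as Kafka or Kinesis.

```python
# Fast data in miniature: compute insight over a sliding window as
# events arrive, instead of amassing them for later batch analysis.
from collections import deque

WINDOW = 5  # number of most recent events to consider


def rolling_average(stream):
    window = deque(maxlen=WINDOW)  # old events fall off automatically
    for value in stream:
        window.append(value)
        yield sum(window) / len(window)  # insight available immediately


# Simulated stream of, say, per-minute transaction amounts.
events = [120.0, 80.0, 200.0, 95.0, 310.0, 150.0]
for avg in rolling_average(events):
    print(f"rolling average: {avg:.2f}")
```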

The bursting of the bitcoin hype bubble

Blockchain and bitcoin. One is distributed ledger technology, the other a cryptocurrency based on blockchain, but their fortunes, in terms of perception and the high hopes placed on them, have been entwined since they moved into the collective consciousness of companies and consumers four or five years back. Hence, with the spectacular crash of bitcoin over the course of 2018, we’ve seen mounting scepticism and disenchantment around the technology it’s built on. My hunch is this won’t change any time soon. There’s a growing realisation that blockchain remains a niche solution in search of the perfect problem, but, like everyone else, I’ll be watching this space with interest.

Where to now for chatbots?

The rise of robotic automation technology has resulted in a rash of chatbots (rare is the company that doesn’t have one, at least in the works), but there’s a big difference between doing something and doing it well. It’s easy enough to create a chatbot that looks cool in a lab, but actually developing a service that isn’t a glorified interface to a website search, that isn’t terribly annoying for end users, that provides real value… not so much. Until we can teach them to recognise context and cope with nuance, fundamental aspects of human language and communication, most will continue to deliver interactions that are not much better than the commonly disliked IVR (interactive voice response) robotic menus answering help lines.

Moving beyond the AI hype

The hype about AI (artificial intelligence) has outpaced its fundamental data science substance (classification and pattern matching), but as the hype bubble deflates, we’re seeing some of the ‘next order’ challenges associated with this technology come to the fore. Chief among them are explainability, bias compensation and the broader ethical issues.

I anticipate these will become critical concerns as AI is integrated into more business processes. Human beings won’t wear the idea of unaccountable algorithms making decisions about issues like credit worthiness, insurance premiums, or even life and death unless we find cogent ways to explain the ‘thinking’ behind them.
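One established technique for that kind of explanation is permutation importance: shuffle one input feature and measure how much the model’s accuracy drops. Here’s a small sketch using scikit-learn; it’s one tool among many, not a complete answer to explainability or ethics.

```python
# Permutation importance sketch: features whose shuffling hurts the
# score most are the ones the model leans on to make its decisions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)
names = load_breast_cancer().feature_names

# Report the features the model relies on most.
ranked = sorted(zip(names, result.importances_mean), key=lambda kv: -kv[1])
for name, importance in ranked[:5]:
    print(f"{name}: {importance:.3f}")
```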

Event sourcing patterns go mainstream

Event sourcing is an architectural pattern whereby the events in a system are recorded in an immutable, strongly-ordered sequence. The resultant event log can be used to replay, reprocess and recover transactions, perform audits and identify the exact sequence of changes that led to a particular state. The event sourcing pattern is hardly new and, while clearly useful, especially in regulated financial institutions, it hasn’t yet seen widespread adoption. The increasing availability of component technologies that make it easier to implement, especially in the cloud, should finally drive greater adoption of event sourcing.
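Here’s a minimal sketch of the pattern: an append-only log is the single source of truth, and current state is derived by replaying it, which is exactly what makes audit and point-in-time recovery straightforward. The account and event names below are illustrative only, not a reference to any particular system.

```python
# Event sourcing in miniature: state is never stored directly, only
# derived by replaying the immutable, strongly-ordered event log.
from dataclasses import dataclass


@dataclass(frozen=True)
class Event:
    sequence: int   # strong ordering
    account: str
    kind: str       # "deposited" or "withdrawn"
    amount: float


class EventLog:
    def __init__(self):
        self._events = []  # append-only; existing events never change

    def append(self, account, kind, amount):
        self._events.append(Event(len(self._events), account, kind, amount))

    def replay(self, account, up_to=None):
        """Rebuild an account balance, optionally as of a past sequence
        number: the audit/recovery property the pattern buys you."""
        balance = 0.0
        for e in self._events:
            if up_to is not None and e.sequence > up_to:
                break
            if e.account == account:
                balance += e.amount if e.kind == "deposited" else -e.amount
        return balance


log = EventLog()
log.append("acc-1", "deposited", 100.0)
log.append("acc-1", "withdrawn", 30.0)
log.append("acc-1", "deposited", 50.0)
print(log.replay("acc-1"))           # 120.0 -- current state
print(log.replay("acc-1", up_to=1))  # 70.0  -- state after second event
```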

About the author: Sergei Komarov is chief architect at NAB. Before working at NAB, Sergei held senior positions at Barclays and PayPal.
