In recent years, big data and artificial intelligence (AI) have received overwhelming attention; however, the interesting – even obvious – connection between the two has rarely been explored. It is the combination of big data and AI working together that is now enabling business leaders to deliver new insights, efficiencies and even new functions that have not been possible before. This is evident in the increasingly useful role big data and AI are playing across a broad spectrum of traditionally outsourced functions, from recruitment, HR, finance and supply chain through to security and IT.
The combination has already had a major impact on how the stock market works, synthesising ever more data – to the point at which some people believe these systems will eventually be able to accurately predict both market trends and human influences on the market. At a more everyday level, Google uses deep learning to recognise objects in images, AI is the technology behind Facebook's DeepFace friend-tagging feature, and machine learning is the basis of Amazon's recommendation engine.
To make matters more interesting, the growing Internet of Things (IoT) industry is changing the way businesses, governments and consumers interact with the physical world. Gartner has forecast that by 2020 there will be some 20.8 billion consumer and enterprise connected devices, while IDC sees an IoT market of $7.1 trillion in just four years. The consequent exponential leap in data produced by these interconnected devices can leave enterprises reeling – or reaching out to the experts for guidance. Enterprises need to view big data and AI at a more strategic level – not just as back-office enablers of efficiency, but as the key to intelligent decision-making and as drivers of new approaches to operational effectiveness. It has become a C-level imperative that cannot be ignored.
If we think of automation as a mechanism to instrument a business, then big data and AI are the tools to accelerate the delivery of business outcomes. This loosely translates into doing more with a lower cost to serve. Operational data can and should be collected effectively; organisations need to be able to store it, report against it in real time and examine it analytically. This approach is not new, and this sort of activity has always taken place to varying degrees. The main difference now is that the organisational mindset should be predicated on the principles of big data: capture everything useful, then apply analytical approaches to distil key insights across the breadth and depth of the business to improve operational excellence and forward planning. Machine learning is rapidly becoming a necessity to make this happen – the volume and complexity of data make it extremely hard for businesses to continue analysing it in traditional ways and remain competitive.
It is also worth bearing in mind the need for the right skill sets. Correlation does not imply causation, and while big data, analytics and machine learning techniques can assist, domain expertise remains a critical part of expanding the use of big data and AI to drive new benefits. At an operational level, there is a move away from processing to teaching, as intelligent assets learn how to execute processes. This transition of skills is an essential ingredient in maximising benefits.
We need to recognise the disruptive impact that big data and AI can have on business models. They are driving an evolution in the way outsourced services are provided – one that will lead to better outcomes for both buyer and provider, though there is still work to be done on both sides. The opportunity for doing things differently is significant, from better management of operational processes through to predictions of future demand. The challenge will be to adopt such techniques not as a sideline activity, but as an essential part of the core business.
Frank Windoloski is Vice President, Insights & Data, Capgemini Australia & New Zealand