Machine learning is the new face of enterprise data

AI isn't just for Hollywood anymore: Putting artificial intelligence into the cloud has made it available to anyone

While the complexity of the searching and result-ranking technology behind Apple's Siri may have eluded most of its users, the value of a context-sensitive personal assistant certainly has not. Yet while Siri spawned a new generation of anthropomorphic digital assistants, researchers in machine learning and artificial intelligence (AI) are taking the concept much further to help enterprises keep pace with the growth of their data.

Industrial products distributor Coventry Group is among the latest companies to jump on the trend. The company, whose fasteners, fluid systems, gaskets and hardware divisions collectively employ around 650 people, is working with Adelaide-based data-analytics specialist Complexica to apply that company's AI technology – personified as Larry, the Digital Analyst – to guide decisions around sales and pricing strategies.

Introducing Larry – a collection of algorithms delivered on a software-as-a-service (SaaS) basis via Amazon’s cloud – to Coventry's business is a two- to four-month process that will see the technology fine-tuned to the company's operating parameters.

Once it is up and running, however, Larry will begin a process of ‘learning’ that Complexica CEO Matthew Michalewicz says will help deliver real-time insights and decisions for the company.

“Companies are trying to make data-driven decisions and a lot of value can be generated by using AI to automate the analysis of very large data sets,” he explains. “The most value can be delivered to companies that either have a lot of customers or a lot of products – 2000 to 3000 SKUs or more.”

Larry monitors the business on an ongoing basis, analysing a range of operational parameters and making decisions around sales and marketing strategies. A process of guided trial and error, with regular evaluation of outcomes, allows the system to review previous decisions and factor the results into its strategic recommendations.

“We are really targeted on business outcomes – revenue and margin specifically – and looking at the number of levers you can pull to influence those,” says Michalewicz, who founded the company with his father – SolveIT Software founder, analytics researcher and University of Adelaide academic Dr Zbigniew Michalewicz.

“Once you go live, you have automated answers to workflow decisions and idiosyncrasies that are unique to the business. Larry overlays data, looks for correlations, does regression analysis and – by evaluating the outcomes of previous decisions – the system gets smarter with time.”
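
To make that pattern concrete, here is a minimal, hypothetical sketch in Python of the kind of regression-and-feedback loop Michalewicz describes: fit a model over past decisions and their observed outcomes, score candidate strategies, then re-fit as new results arrive. It illustrates the general technique only, not Complexica's implementation – all field names and figures are invented.

    # Hypothetical sketch of a "learn from past decisions" loop.
    # Not Complexica's system; all data and field names are invented.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Historical decisions: [discount offered (%), order volume (units)]
    past_decisions = np.array([
        [5.0, 100],
        [10.0, 250],
        [2.5, 80],
        [15.0, 400],
    ])
    # Margin (in dollars) that each past decision actually produced
    observed_margin = np.array([1200.0, 2100.0, 900.0, 2600.0])

    # Regression relating decision parameters to business outcomes
    model = LinearRegression().fit(past_decisions, observed_margin)

    # Score candidate strategies and recommend the most promising one
    candidates = np.array([[5.0, 300], [10.0, 300], [15.0, 300]])
    predicted = model.predict(candidates)
    best = candidates[np.argmax(predicted)]
    print(f"Recommend discount {best[0]}% at {best[1]:.0f} units; "
          f"expected margin ${predicted.max():.0f}")

    # When the recommendation is acted on and the real outcome observed,
    # the new data point is appended and the model re-fit – the
    # 'gets smarter with time' feedback loop.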

Researchers were teaching computers to ape humans long before HAL 9000 made space travel creepy in 1968, but technological advances – and, in all likelihood, the ubiquity of Siri – have fuelled a resurgence in AI tropes through human-like personae like Her’s Samantha and Ex Machina’s Ava. All owe a debt to the 2011 game-show victory of IBM's Watson technology, which revived interest in machine learning and its promise to revolutionise the processing of data.

The contemporaneous emergence of big data – which has surged over the course of this decade thanks to improvements in storage, accessibility, and analytics technologies – has given machine learning a new raison d'être and promised businesses an entirely new level of insight from the data they are amassing. If unwieldiness is a side effect of the big-data revolution, AI is emerging as the antidote.

Buoyed by Watson's victory in the US game show Jeopardy!, IBM has led the charge to bring machine learning and analytics to the mainstream, investing US$1 billion (A$1.39b) to establish its Watson Business Group and fill it with engineers working to find new commercial applications for a system so resource-intensive – the original Jeopardy! configuration spanned ninety IBM Power 750 servers with a total of 2880 POWER7 processor cores and 16TB of RAM – that few businesses could actually consider implementing it.

Cloud-computing models helped IBM democratise Watson through an as-a-service model that has attracted support from resellers such as KPMG as well as business users such as Woodside and the Department of Immigration and Border Protection. Business users are abuzz with the possibilities of applying AI to their data stores, and in February IBM offered a US$5 million (A$6.95m) prize for the biggest breakthroughs made using Watson by 2020.

IBM is far from alone in its quest to apply machine learning techniques to the enterprise: in March, Google's AlphaGo artificial-intelligence engine beat Go champion Lee Se-dol in a series of matches that left pundits scratching their heads and technologists rubbing their hands in glee at the possibilities for machine learning-based analytics.

In a recent speech to the American Chamber of Commerce, Telstra CEO Andrew Penn pegged AI as the “most significant driver of technology innovation” and noted its “profound” implications.

“Advanced algorithms in conjunction with the massive increase in computer power, as Moore’s law enters its 50th year, mean computers can now see and hear better than humans,” he said, noting the opportunities driven by the confluence of data, mobility, IoT, computing power and advanced algorithms. “The world ahead will be full of products and services that are intelligent and able to learn our preferences, interact with each other, the cloud and other devices that we have.”

All kinds of companies are now working to turn this vision into reality. Accenture recently announced that it will build a range of virtual-agent services based on IPsoft's Amelia AI platform; Google is revamping AlphaGo as its Cloud Machine Learning service, with speech transcription and translation among its offerings; Salesforce.com bought AI specialist MetaMind; Amazon Web Services already offers its Amazon Machine Learning architecture; Microsoft has built Azure Machine Learning into its Cortana Intelligence Suite; and Dell is refining its Automated Full-Time Equivalent (AFTE) technology to speed the processing of repeatable, rules-based tasks for a market segment that has come to be known as robotic process automation (RPA).

The flurry of interest in cognitive computing – a market that research firm IDC expects to grow by 55 per cent annually to be worth US$31.3 billion by 2019 – reflects the corporate hunger to extract new value from transactional data stores and Internet of Things (IoT) sensor logs that, surveys such as the Verizon IoT State of the Market report suggest, are largely being underutilised due to their complexity and sheer volume.

“Everything we do and interact with is now creating digital streams of information,” says Evan Stubbs, chief analytics officer with analytics provider SAS Australia and a board member of the Institute of Analytics Professionals of Australia. “This is being captured into a massive corpus of information that – if you want it – it's there.

“The question is economics around how much we can feasibly store. Virtualization and cloud give us the ability to scale algorithms to use as much computing power as we need to throw at them – and you have the scale of computing capability to actually do something with machine learning.”

Those possibilities have quickly trickled into the enterprise world as new data sources – particularly the push to collect huge volumes of security data for threat-intelligence analysis, the desire to derive patterns from mountains of transactional data, and the need to store and manage a flood of Internet of Things (IoT) data – lend immediacy to the need for real-time data analysis.

Given the heavy lifting that AI requires in the background, the confluence of the technology with cloud and virtualization capabilities has expanded its possibilities dramatically: paired with the ubiquitous sensors of the IoT world, says Stubbs, “suddenly you've got a whole bunch of low-cost devices that can start making decisions on their own and co-ordinating activities.”

“You have the ability to collect enough information to power machine learning, and the scale of computing capability to actually do something with that. The Internet of Things gives the world around us the ability to make decisions for us. This is the cornerstone of large-scale societal and economic disruption – and this disruption is not hypothetical. It's something that will change the world as we know it.”

That’s a tall order, but after years of brainstorming, the possibilities of AI are closer to becoming reality than ever before. Research firm Gartner has been expounding on the possibilities of what it calls the ‘algorithm economy’ for some time, and today's AI systems represent the latest in what is likely to be a long series of iterations that will refine process knowledge and analysis to the point where such systems – whatever name they come to be known by – become the key to balancing out the analytical conundrum imposed by big data.

The list of potential applications is as eclectic as it is enlightening – and it is expanding quickly. “Whatever we've seen in technology over the last 20 years, my gut tells me that we haven't seen anything yet,” Michalewicz says. “What we're going to see over the next 20 to 30 years might surprise and scare us at the same time.”
