The Internet of Things? What matters is ‘The Analytics of Everything’

The IoT won’t stop growing at 20.8 billion connected objects, and the data won’t slow down

A challenge only to the imagination at the outset, the Internet of Things – IoT – is now a reality and growing daily. Initial excitement about its potential to reduce costs and to speed up and simplify processes has been overtaken by enthusiasm to test the limits of its enabling technologies.

Some aspects of the IoT are hugely advantageous at the enterprise level – think of smart energy metering by utilities – while others are increasingly automatic for consumers. Making a purchase by waving your smartphone, and then learning of its impact on your bank balance with the same device more or less instantly, is fast becoming commonplace.

Gartner has forecast that by 2020 there will be 20.8 billion ‘connected things’ in a vast network of devices, appliances, vehicles and more. And as with most good new ideas maturing over time, the pace of uptake will vary. In the early IoT days, pundits spoke of refrigerators alerting us to replenishment needs. But how useful is that today if I am unlikely to buy a new – and therefore ‘smart’ – fridge for another few years?

On the other hand, am I prepared to give trusted parties my medical history and other confidential personal details if doing so might save my life one day? Of course I am.

While having everything connected is convenient, speeds activity and lowers cost, arguably far greater value is to be had by taking advantage of the massive volumes of data the connections generate. We are moving from the Internet of Things to the ‘Analytics of Everything’. Today’s smart software and increased processing power – together with cost-effective sensors and economical wireless networks – are capable of delivering insights in near real time.

The benefits are significant. For example, marketers of fast-moving consumer goods can analyse the streaming data exchanged between sensor-laden elements in the retail environment, and across social media, to make pricing decisions with immediate impact. In another sector, fleet managers can get early warning of likely mechanical failure from on-board analytics to make timely rescheduling and maintenance decisions.

And distributors and manufacturers can check that automation is behaving to spec, and fine-tune supply chains to quickly address unexpected supplier delays or suddenly increased demand. For utilities, airlines, ‘smart cities’, security services, resource companies and more, analysing streaming data and combining it with other relevant information from both inside the organisation and beyond – and especially at the network edge – will offer extraordinary value.

So where and how do we get started on analysing everything of relevance?

First, we have to be able to collect, manage and listen to the noise of massive volumes of data from all relevant sources – sensors, devices, business systems, social media and more. We then need to learn to detect the signals in that data that will point to what should be analysed. This calls for the use of advanced predictive analytics and machine learning algorithms, either when data has reached its repository or while still in transit.
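As a rough illustration of what detecting those signals can look like once data has reached its repository, the sketch below uses an unsupervised anomaly detector from scikit-learn on simulated sensor readings. The data, contamination rate and threshold choice are hypothetical assumptions for illustration, not a prescribed approach.

```python
# A minimal sketch of signal detection on sensor data at rest, assuming
# readings have already landed in a repository and can be loaded as an array.
# The data and the contamination prior are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

# Simulated repository extract: one temperature reading per device per minute.
rng = np.random.default_rng(seed=42)
normal_readings = rng.normal(loc=70.0, scale=2.0, size=(1000, 1))
faulty_readings = rng.normal(loc=95.0, scale=5.0, size=(10, 1))  # rare 'signal'
readings = np.vstack([normal_readings, faulty_readings])

# Fit an unsupervised anomaly detector; 'contamination' encodes an assumed
# prior on how rare the signal is within the noise.
detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(readings)  # -1 = anomaly, 1 = normal

anomalies = np.where(labels == -1)[0]
print(f"{len(anomalies)} readings flagged for deeper analysis")
```

The same idea generalises: the detector simply narrows massive volumes of noise down to the handful of observations worth a closer look.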

Analysing data while it is actually being generated reveals patterns of importance as they form. In-stream analytics is invaluable for fast remedial or opportunistic action, whether taken automatically or by human intervention. At one extreme, real-time response could mean avoiding costly loss by detecting and preventing the pending failure of critical industrial equipment. At another, it could help guarantee public safety in a mass transportation environment.
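To make the in-stream case concrete, here is a simplified sketch of detecting a pending equipment failure from a live feed, using nothing more than a rolling window over incoming readings. The stream, window size and alert threshold are assumptions made up for the example.

```python
# A simplified sketch of in-stream detection, assuming a generator stands in
# for a live vibration feed from industrial equipment. WINDOW and THRESHOLD
# are illustrative values, not recommendations.
from collections import deque
import random

WINDOW = 20        # number of most recent readings to evaluate
THRESHOLD = 0.8    # assumed vibration level indicating pending failure

def reading_stream(n=500):
    """Stand-in for a live sensor feed: mostly nominal, then degrading."""
    for i in range(n):
        drift = 0.0 if i < 400 else (i - 400) * 0.01
        yield random.gauss(0.5 + drift, 0.05)

window = deque(maxlen=WINDOW)
for i, value in enumerate(reading_stream()):
    window.append(value)
    rolling_mean = sum(window) / len(window)
    if len(window) == WINDOW and rolling_mean > THRESHOLD:
        print(f"Alert at reading {i}: rolling mean {rolling_mean:.2f} "
              "suggests pending failure - schedule maintenance")
        break
```

The point is not the specific rule but the pattern: the decision is made while the data is still in transit, before it ever reaches a repository.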

Second, we have to be able to make sense of the signals uncovered. To do this we need processing capacity that can handle fast-moving data – including dirty data and data from widely disparate sources – together with a comprehensive set of advanced analytics tools. These tools should support visualisation and hypothesis generation, let us test those hypotheses with statistics and predictive analytics, and help us create models and manage them into production (a minimal sketch of such a pipeline follows).
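The sketch below shows, under assumed data and feature names, what such a pipeline might look like in practice: imputing dirty values, reconciling disparate scales and fitting a predictive model as one unit that could be promoted into production. It is illustrative only, not a prescribed toolset.

```python
# A hedged sketch of moving from signals to a testable predictive model:
# one pipeline that imputes dirty/missing values, scales disparate inputs
# and fits a classifier. Data, features and target are hypothetical.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical merged dataset: e.g. sensor averages, sentiment score, price.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # e.g. sale uplift
X[rng.random(X.shape) < 0.05] = np.nan  # 'dirty' data: 5% of values missing

model = Pipeline([
    ("impute", SimpleImputer(strategy="median")),  # handle missing readings
    ("scale", StandardScaler()),                   # reconcile disparate units
    ("classify", LogisticRegression()),            # predictive model to test
])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model.fit(X_train, y_train)
print(f"Hold-out accuracy: {model.score(X_test, y_test):.2f}")
```

Packaging the cleaning, scaling and modelling steps as a single object is what makes it practical to validate a model and then manage it into production as one artefact.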

Importantly, what is delivered must be customised for the users, whether in IT, in the lab, out in the field or on the shop floor. End-user functionality needs to range from basic descriptive analysis, for which visual analytics can be particularly valuable, through to advanced predictive modelling. However, nothing should be deployed in isolation. The complete Analytics of Everything infrastructure must be integrated for uniformity and to better facilitate future development.

And lastly, back to that word ‘integrated’. Whether we are looking at data management, analysis, visualisation or any other aspect of the full analytics cycle, the analysis of Internet data – just like traditional data analysis – is best deployed on a single platform, rather than piecemeal. The IoT won’t stop growing at 20.8 billion connected objects just a few years from now, and the data won’t slow down. In addition to being able to cater for the complete end-to-end process, platforms for The Analytics of Everything must be able to scale.

David Bowie is the Managing Director of SAS Australia and New Zealand.
