Dealing with enterprise storage in the zettabyte era

Growing interest in IoT and efforts to transform into data-driven organisations have combined to test the limits of enterprises’ storage infrastructure

The annual World Backup Day, which came and went on March 31 without much fanfare, is designed as a reminder for good data hygiene. Yet for companies where backup is not only good hygiene but crucial to the survival of the business, each successive event is a stark reminder that the overwhelming flood of data is only getting stronger.

Trying to count the amount of data the world generates has become as helpful as trying to count the grains of sand on the world's beaches – but that doesn't stop storage vendors, analysts and prognosticators from trying.

Many rely on the Cisco Visual Networking Index (VNI) – an ongoing survey of data-consumption trends – which suggests that annual global IP traffic will soon pass 1 zettabyte (1000 exabytes, or 1 billion terabytes) and will double that figure by 2019. At that point, collective global IP traffic will exceed 511 terabits per second.
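
The arithmetic behind that headline figure is easy to check. A back-of-the-envelope calculation – in decimal units, assuming the roughly 2-zettabyte annual total the VNI forecasts for 2019 – lands close to Cisco's 511 terabits per second:

```python
# Back-of-the-envelope check on Cisco's VNI figures (decimal units assumed).
ZETTABYTE = 10**21                     # bytes
annual_traffic = 2 * ZETTABYTE         # ~2 ZB/year forecast for 2019
seconds_per_year = 365 * 24 * 3600

bits_per_second = annual_traffic * 8 / seconds_per_year
print(f"~{bits_per_second / 10**12:.0f} terabits per second")  # ~507 Tbps
```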

Mobile traffic is also exploding, with the VNI suggesting that mobile data traffic alone grew 74 per cent last year to reach 3.7 exabytes (3700 petabytes) per month by the end of 2015. Extrapolating from current trends, Cisco suggested that this would reach 30.6 exabytes per month by 2020 – and that's just mobile traffic.

Yet conventional movement of data to and from disks and mobile devices is only the tip of an ever-growing iceberg. Consensus suggests that the emerging Internet of Things (IoT) will cause further headaches as an ever-expanding array of smart devices, sensors, monitors, cameras and other equipment rains torrents of information onto enterprises that will struggle to keep up.

Cisco’s VNI noted that 97 million wearable devices generated 15 petabytes of monthly traffic in 2015 alone, and this is expected to explode over time as growing adoption of IoT, machine-to-machine (M2M) and mobile devices pushes the number of mobile-connected devices to 11.6 billion by 2020.

Robert le Busque, regional managing director for operations and strategy with Verizon Enterprise Solutions, blames the rapidly expanding IoT for creating “a significant multiplier in terms of the amount of data collated, organised and interpreted.”

Businesses need to “start thinking more critically and actively about how you would take that information and have it contribute to the value-based applications within your organisation,” he adds. “Having it drive meaningful decisions for consumers, enterprises, or governments is the key. We are at the very beginning of the runway.”

Before it can drive those decisions, however, data needs to be collected, stored, and organised so it can become part of the ongoing enterprise strategy rather than an anomalous pool to be dipped into occasionally. Verizon, along with vendors like LogMeIn, ThingWorx, and Oracle, is aiming to address this requirement with IoT management platforms that include analytics elements to keep customers on top of the flood of new data as it is generated and collected.

Businesses are already feeling the pinch from the flood of data, with the recent Verizon-Oxford Economics State of the Market IoT 2016 report finding that 92 per cent of businesses are using less than a quarter of the data their IoT environments produce. Three years from now, Verizon believes, even the most proactive IoT adopters will on average be using just 48 per cent of that data.

Storage in the zettabyte era

The widespread ‘sensoring’ of the enterprise will push businesses from long-established IT-based models of operation to a ‘digital business model’ based on scale-out architectures and commodity hardware, according to Gartner research vice president Roger Cox.

Cox, who presented a five-year storage scenario for enterprises at the company's recent Infrastructure, Operations & Data Center Summit, says the digital business model will not be achievable by maintaining the status quo due to data growth, budget restrictions, increasing SLA demands, and infrastructure complexity.

The emerging digital business model will be largely software-based and driven forward by nimble innovators rather than the monolithic, hardware-based data environments of large vendors. This transition will force CIOs to manage a 'bimodal IT' operating model in which inward-facing operators concerned with storage safety and reliability increasingly work alongside externally focused staff pursuing storage agility, scalability and flexibility.

By 2017, Cox believes, three-quarters of global businesses will have tried to implement a bimodal IT organisation but half “will struggle and go through multiple attempts before reaching a working state”.

“People have the skill set but it's also very much a learning process,” he explains, noting that many companies will need to invest heavily in retraining to both unlearn old habits and to learn the new ones necessary to provide appropriate storage skills. “The personnel acquired to develop and manage the digital business model are more expensive than those required to support and manage traditional infrastructure.”

Growth in the generation and usage of data poses issues far more complex, however, than simply teaching people to use increasingly esoteric prefixes to describe their data usage. Life in the zettabyte era will increasingly be characterised by enterprises that are almost drowning in the data they have generated – and struggling to make sense of what data to keep, what data to lose, and what data provides the most business value.

Backing up zettabytes would seem to be out of the question – but the very threat is enough to keep many CIOs awake at night. Whereas even multi-terabyte data centres were considered a novelty a decade ago – when the fabled overnight backup window was still achievable under the right circumstances – big companies are now scaling into the petabytes, and backup has become a continuous process with no beginning and no end.

This change is challenging conventional notions of backup and marginalising the role of physical media, which for decades provided a reliable method of backing up masses of data. The densest Blu-ray media now cram 300GB onto a single write once read many (WORM) disk, while the latest iteration of Linear Tape-Open (LTO) backup tape – LTO-7 – crams 6TB of raw data onto a cartridge; at a transfer speed of 300 megabytes per second, that works out to writing a little over 1TB per hour.

Such once-ubiquitous technologies rarely come up in conversations about storage in large enterprises – and it's not hard to see why. As data volumes under management grow, the maths around conventional methods of enterprise storage quickly breaks down.
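
To put numbers on that breakdown, consider a hypothetical 1PB full backup onto LTO-7 using the raw capacity and transfer rate quoted above – a sketch in decimal units that ignores compression and parallel drives:

```python
# How a 1PB full backup looks against LTO-7's raw specifications.
PB = 10**15                    # bytes, decimal units
CARTRIDGE = 6 * 10**12         # 6TB raw per LTO-7 cartridge
DRIVE_SPEED = 300 * 10**6      # 300MB/s per drive, uncompressed

data = 1 * PB
cartridges = data / CARTRIDGE
hours = data / DRIVE_SPEED / 3600

print(f"~{cartridges:.0f} cartridges, ~{hours:.0f} hours on a single drive")
# ~167 cartridges and ~926 hours (nearly 39 days) on one drive --
# the overnight backup window is a distant memory at this scale.
```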

And while solid-state arrays offer some improvements – they can cut administration costs by 48 per cent, power consumption by 76 per cent, space demands by 63 per cent and maintenance by 16 per cent, according to Gartner figures – they will be just one of several solutions for CIOs forced to think differently about their infrastructure.

Your data, their cloud

Backups in large environments have historically been thought of in the context of hot-hot data centre configurations. However, with workable and effectively infinite cloud storage available at low cost, CIOs now have more options than ever for dealing with the data deluge.

For managing truly large data sets, a growing number of CIOs are turning towards increasingly mature cloud-storage environments that have rapidly grown from Dropbox-sized 5GB partitions for individual users to virtually unlimited repositories offered to businesses on a per-GB basis.

Use of cloud-based storage environments offers one key benefit that will be of immediate appeal to data-deluged CIOs: The scalability and inherent redundancy of at-rest storage services like Amazon Web Services' Glacier, Google's Nearline Storage and Veritas' Enterprise Vault.Cloud make them viable options for dumping large backups and masses of data that you simply can't bear to discard.
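
As a minimal sketch of what that dump-it-in-cold-storage pattern looks like in practice – here using an S3 lifecycle rule to age backups into Amazon's Glacier tier; boto3, configured AWS credentials, and the bucket name and prefix are all assumptions for illustration:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical policy: move anything under backups/ into Glacier-class
# cold storage once it is 30 days old, rather than holding it on hot storage.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-enterprise-backups",   # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-old-backups",
            "Filter": {"Prefix": "backups/"},
            "Status": "Enabled",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
        }]
    },
)
```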

There are other advantages to shoving your data into the cloud: Because cloud providers bundle the entire cost of storing, backing up and managing data into a single price per gigabyte or terabyte, enterprises adopting cloud storage quickly gain a measure of cost predictability that has long been extremely hard to pin down.

“The person in charge of the storage budget can now understand what they're spending their money on,” says Scott Meddings, APJ regional architect with storage giant Veritas, which like many of its rivals has been pushing into the cloud arena as part of a diversified storage strategy.

“We were never able to do this before; it hasn't really been possible to deliver until the cloud, and that new transparency to the cost of data is what's driving adoption now. The amount of data that we’re going to see over the next five years is going to be a lot more, and managing it is going to be a core cost for a lot of organisations.”

Little wonder that, despite the explosion in consumption of data, hard drive makers are reporting slowing revenues – Seagate Technology posted a US$21m loss for the quarter ending April 1 despite introducing record-sized 10TB helium-filled drives – as diversification of storage strategies and a shift to flash-based arrays drives enterprises away from exclusive reliance on spinning disks.

IDC attributed such flagging results to an overall 14.4 per cent slowdown across the enterprise storage systems market, with server-based storage climbing 6.1 per cent in volume terms during the quarter while the larger external storage systems market declined 2.3 per cent year on year. IDC also noted surging sales of flash-based storage systems, which grew 71.9 per cent year on year.

“The enterprise storage market closed out 2015 on a slight downturn, as spending on traditional external arrays continues to decline,” IDC Storage Systems research manager Liz Conner said in a statement. “Over the past year, end user focus has shifted towards server-based storage, software-defined storage, and cloud-based storage. As a result, traditional enterprise storage vendors are forced to revamp and update their product portfolios to meet these shifting demands.”

Performance is still limited by telecommunications capacity, but the increasing intelligence of storage subsystems is turning cloud storage into another tier of the traditional information lifecycle management (ILM) paradigm – supplanting tape as a long-term storage method for regular backup and infrequent but critical data recovery when needed.
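
The decision logic behind that extra tier can be stated simply. Below is a minimal sketch of an ILM-style placement policy – the 30- and 180-day thresholds and tier names are illustrative assumptions, not any vendor's implementation – in which cloud storage takes the archival slot tape used to occupy:

```python
from datetime import datetime, timedelta

def assign_tier(last_accessed: datetime, business_critical: bool) -> str:
    """Hypothetical ILM policy: place data on a tier by access recency.

    Thresholds are illustrative only; cloud tiers occupy the archival
    role that tape traditionally filled.
    """
    age = datetime.now() - last_accessed
    if age < timedelta(days=30):
        return "flash"           # hot data on a solid-state array
    if age < timedelta(days=180):
        return "disk"            # warm data on capacity disk
    if business_critical:
        return "cloud-nearline"  # cold, but recoverable in minutes
    return "cloud-archive"       # deep archive; retrieval takes hours

# Example: year-old but critical data lands in nearline cloud storage.
print(assign_tier(datetime.now() - timedelta(days=365), business_critical=True))
```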

In the long term, IDC says, enterprise storage will drive 45 per cent of petabyte demand by 2018 – and will continue its inexorable shift towards the cloud, where providers are swilling storage capacity at prodigious rates. Cloud IT-infrastructure spend grew 21.9 per cent year-on-year in 2015, according to IDC, which expects a further 18.9 per cent rise this year – including 12.4 per cent growth in storage – as public-cloud providers come to dominate infrastructure spending in coming years.

The key to surviving the data deluge, Cox says, is keeping open to the possibilities it presents. “Users need to expand their horizons in considering who provides their storage infrastructure and their IT infrastructure in general,” he says. “Looking at other opportunities can give you an infrastructure that is more flexible, agile and lower cost than simply buying from the same vendors that you've been buying from for the last five, 10, or 20 years.”
