The trouble with the Internet age is that activities like sending e-mail, filling in forms, clicking on ads, uploading files or signing up to electronic databases all generate data that has to be carefully stockpiled for safekeeping.
Loads of data; avalanches of it.
Factor in the push to e-business, the huge upsurge in online transaction processing and increased reliance on immense electronic databases and it's easy to see why storage demands are skyrocketing. With the amount of data generated around the world doubling each year, the need to manage petabytes rather than terabytes of data is just around the corner.
Meet the storage area network (SAN), a technology starting to look increasingly appealing to IT managers needing to store and manage vast amounts of information in a high-performance environment. Research company IDC expects the market for SAN to be worth $23.8 billion by 2003.
SANs promise more efficient use of primary and secondary storage devices, better end user productivity, and simplified storage administration, with sharing - that's both storage and data sharing - as a significant driver.
Storage sharing matters because, if more than one computer can access a common pool of storage devices, you can spread the cost of reliable but relatively expensive Redundant Arrays of Independent Disks (RAID) and robotic tape libraries across several computers. In a SAN, storage configuration becomes a simple logical assignment rather than a significant physical restructuring - and that reduces Total Cost of Ownership (TCO).
At the same time, data sharing lets applications and end users access the same data files. Users share data every day with network file systems over local area networks (LAN), so why bother using a SAN for data sharing?
"The reason is speed," said Lachlan Macdonald, managing director of AustStor. "Multiple computers can concurrently transfer large files at rates comparable to locally attached disks over the SAN without adversely impacting the corporate LAN."
SAN data banks sit behind the servers, working in parallel with the host networks to allow information to be consolidated, stored and retrieved in real time.
Because a SAN can contain both primary storage devices like RAID and secondary storage devices like tape, each of which is managed independently, network managers are relieved of the need to create separate physical networks for each.
Better still, with the right software all the devices can be centrally managed as a single entity, making it easier to deal with storage networks containing myriad separate servers and devices.
In the SAN environment, storage devices like DLTs (Digital Linear Tape) and RAID arrays link up to a server or servers via a high-speed interconnection - frequently Fibre Channel, which provides speed, distance, I/O capability and multiprotocol support to storage set-ups.
Fibre Channel was developed with the limitations of SCSI squarely in mind and designed from the outset to be more reliable, scalable and flexible than SCSI. Dataquest expects to see Fibre Channel completely displace both SCSI and SSA for host-to-storage connectivity by 2002.
"Fibre Channel is better for one reason - it's going to be as cheap as SCSI and its distance is extendible," Mark Knittel of Computer Network Technology (CNT) told a Network World Storage Networking Town Meeting in the US earlier this year. "For example, Fibre can run 10,000 metres between devices, whereas SCSI devices can only be separated by about 12 metres."
A typical SAN environment provides any-to-any communication among all devices on the SAN but also provides alternative paths from server to storage device. Thus if one server is down or running at a snail's pace, access to the storage device can be achieved via other servers on the SAN.
When data outstrips storage capacity, network managers need only add drives to the storage network itself - where they can be accessed from any point - rather than connect them to any specific server.
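The pooling, any-to-any access and failover behaviour described above can be sketched conceptually. The following is a minimal illustration only; the class and function names are invented for the example and do not reflect any vendor's actual API:

```python
# Conceptual sketch of a SAN storage pool - illustrative names only,
# not any vendor's API.

class StoragePool:
    """A pool of drives reachable from any server on the SAN."""

    def __init__(self):
        self.drives = []

    def add_drive(self, drive_id, capacity_gb):
        # Capacity is added to the shared pool, not to one server.
        self.drives.append({"id": drive_id, "capacity_gb": capacity_gb})

    def total_capacity(self):
        return sum(d["capacity_gb"] for d in self.drives)


def reach_storage(servers, pool):
    """Any-to-any access: use the first healthy server path to the pool."""
    for server in servers:
        if server["up"]:
            return f"{server['name']} -> pool ({pool.total_capacity()} GB)"
    raise RuntimeError("no path to storage")


pool = StoragePool()
pool.add_drive("raid-0", 500)
pool.add_drive("raid-1", 500)   # growing capacity touches no server config

# srv-a is down; access falls back to srv-b automatically.
servers = [{"name": "srv-a", "up": False}, {"name": "srv-b", "up": True}]
print(reach_storage(servers, pool))  # -> srv-b -> pool (1000 GB)
```

The point of the sketch is the shape of the design: drives belong to the pool rather than to a server, and any healthy server provides a path to them.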
Acceptance is growing for remote SAN implementations like disk mirroring and tape backup and restore as the best way to meet the need for high availability recovery of mission-critical data. That's because using high-speed interconnections to link servers and storage devices effectively creates a separate, external network that is connected to the LAN but operates independently.
SANs also let you add bandwidth without burdening the main LAN and conduct online backups without squeezing users on bandwidth.
Proponents say SANs provide more options for network storage, making it possible to create separate networks to handle massive amounts of data and giving organisations much faster access than network attached storage (NAS). Given time, SANs will even be able to manage storage across multiple heterogeneous operating systems.
But they aren't there yet, and according to Gartner senior industry analyst Mathew Boon it could be a bit of a wait.
"The beauty of the SAN is that particularly when using Fibre Channel architectures you can have them spread out over wider areas and so they are very good for data safety," Boon said.
"But the way that everyone has touted SANs from the word go is to say they'll get to a point where it doesn't matter what systems you connect, it doesn't matter what storage you have, they'll all share storage and they'll share information heterogeneously. They're just not at that stage yet," he said.
In and of themselves, SANs do nothing to consolidate the 'islands of storage' attached to individual computers within most organisations - the storage remains dedicated to and controlled by individual computer platforms.
What they can do is allow partitions within storage pools to be accessed by diverse operating environments or platforms.
"I believe they are working quite effectively at doing that, particularly in the enterprise level. EMC has come quite a long way in that regard," Boon said.
But he said the need to manage those storage pools makes software the main differentiator between different SAN solutions.
"That's what you really need to look into: what it will actually do for you once it's implemented, and how easy it is to manage."
One vendor leading the way is Compaq, which recently announced a storage offering that includes SANworks, a line of enterprise storage management software.
IBM has also put its full weight behind SANs, with a $US400 million initiative to harness the explosive growth of data generated by e-business.
And in another advance Computer Network Technology in April announced SAN over IP, which allows companies to build storage infrastructures over IP-based networks such as intranets and virtual private networks (VPNs) and link localised 'SAN islands' to create an enterprise-wide SAN.
"The ability to perform storage applications over IP has huge implications for the entire market," said Steve Duplessie, an analyst with the Enterprise Storage Group, a storage research analyst. "Using the IP backbone means wide-area access now has the same feel as local, which will change the way people think about storage. The implications for wide-area clustering, disaster recovery, and business continuance are enormous."
Other major players include EMC, Hitachi Data Systems and Network Appliance, which claims to have 46 per cent of the disk-based network-attached storage (NAS) marketplace.
It would surprise many outside that environment to learn the storage area network concept has been around since IBM introduced its ESCON technology on the S/390 about 10 years ago. Most recent attention, however, has focused on SANs for the open systems environment.
Graham Penn, IDC's director of storage research, said: "Those SANs have been on the horizon for about two years, but it wasn't until arguably the second half of last year that we started to get early implementations.
"Within the Asia-Pacific region Australia tends to be ahead of the game, [but] we're lagging 12 to 18 months behind the early implementations in the US."
Early adopters here include Metway Bank, Sanitarium Health Foods, Optus - whose SAN is being run by Compaq as an outsourced environment - and National Australia Bank.
PocketMail, which has set up a network operations centre in Sydney, provides high-availability storage for users as well as simultaneous access to both its Unix and NT file servers. It gets business continuity planning support via continuous snap mirroring of the entire file system to its disaster recovery operations centre, while the site architecture supports fast scale-up and flexible storage for data and application services.
Meanwhile Nortel Networks uses Network Appliance's NetApp Filer for storage of home directories.
David Dodds, Nortel Networks' manager of development support, said staff generally need access to their home directories from a range of Unix platforms as well as the PC environment.
"We use the NFS (Network File System) and CIFS (Common Internet File System) ability of the NetApp Filer to share out the same directory to the PC and Unix environments.
"Users can then save a file in Unix and go directly to their Windows Explorer to open it there. This dual ability provides a highly flexible environment to work in."
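The dual-protocol arrangement Dodds describes boils down to one volume exported through two paths. A local-filesystem analogy can illustrate the idea - here two symlinked directories stand in for the NFS mount and the CIFS share, so a file saved through one path is immediately visible through the other. This is an illustration of the concept only, not how a NetApp Filer is actually configured:

```python
# Analogy for dual NFS/CIFS export: two "mount points" (symlinks here)
# expose the same underlying volume. Illustrative only.
import os
import tempfile

volume = tempfile.mkdtemp()                            # the shared volume
unix_mount = os.path.join(tempfile.mkdtemp(), "nfs")   # "NFS mount" on Unix
pc_mount = os.path.join(tempfile.mkdtemp(), "cifs")    # "CIFS share" on a PC
os.symlink(volume, unix_mount)
os.symlink(volume, pc_mount)

# Save a file from the Unix side...
with open(os.path.join(unix_mount, "report.txt"), "w") as f:
    f.write("written over NFS")

# ...and open the same file from the PC side.
with open(os.path.join(pc_mount, "report.txt")) as f:
    print(f.read())  # -> written over NFS
```

Because both paths resolve to the same storage, there is nothing to synchronise: the "sharing" is a property of where the data lives, not of any copying between environments.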
Nortel also uses the device as a central repository for all code developed at Nortel.
"The NetApp gives us a wide range of performance increases ranging from 2.5 times increase up to eight times the performance increase, depending on the project in question," Dodds said.
But while those early adopters are making headway, Penn says not all Australian implementations have yet gone organisation-wide. There are also cases where solution providers build a storage solution based on SAN without telling their customer - a way of implementing new technologies without scaring those who might be put off by any notion of being on the leading edge.
"They're getting there by default, not necessarily with full board approval and knowledge of how the storage solution has been built," Penn said.
IDC estimates between 10 per cent and 15 per cent of Asia-Pacific organisations currently have some type of SAN in place. That figure should grow to about 35 per cent by 2003, according to Penn, and in some data intensive industries like banking, finance, telecommunications and Internet service provision the figure will be higher.
"There's huge growth in storage, and the fast-growing component will be storage area networks, because they are providing a good answer to a real problem," Penn said.
And for IS managers who, with Y2K work behind them, finally have the leisure to consider ways to handle the huge information flow within their organisations, SANs are starting to look like the answer to a question few organisations had until recently even thought to ask, he said.
Perils and pitfalls
As promising as the SAN currently looks as a solution to exponential growth in storage needs, existing SAN implementations still fall short on standards.
"The various standards that need to be in place to enable you to operate across operating systems and within various vendors are still not quite there," Penn said.
"That makes for some software challenges and also some technical, physical-connect type challenges.
"The software challenges are associated with operating systems, particularly the way Windows NT tried to grab the whole of the environment and almost causes you to reformat a Unix box attached to the same SAN.
"So it's at that level that you need to be very careful what you're doing when you've got multiple operating systems in the environment.
"The physical connection challenges are associated with the various Fibre Channel standards being put in place by the various host bus adapter vendors and the switch vendors," Penn said.
He said the issues would be resolved in time, just as they were with Ethernet 10 years ago.
He noted the Storage Networking Industry Association (SNIA) - an industry group of more than 75 companies working on storage networking standards for improved management - has just accepted an offer from Compaq of a facility somewhere in Colorado to operate an industry-wide interoperability testing laboratory, which should make a difference.
While users wait for the standards to be put in place, Penn says most companies might be better off dealing with a single vendor who accepts responsibility for implementing and managing the SAN.
"I'm not saying it can't work when you've got multiple vendors participating, but you could be saving yourself a lot of headaches if a vendor is prepared to take full responsibility."
There also remains a danger of being locked into a proprietary platform.
"If you looked at say Compaq or IBM or Hewlett-Packard or whatever within their enterprise SAN environments, you could well find that you are restricted to their proprietary platforms to some degree," Gartner's Boon said.
Avoiding lock-in looks increasingly important as IT managers grapple with ways to cope with the vast flood of data that otherwise threatens to drown us all.