WAN optimization in a large enterprise is a nascent and headline-grabbing market. Still, IT managers are faced with the burning question: Is it really worth the investment?
The technology uses techniques such as compression, caching and de-duplication to reduce the amount of traffic crossing the link, rather than making the underlying network itself run any faster. Whether it's a panacea for network ailments is open for debate, and analysts are by turns optimistic and mildly skeptical.
"WAN optimization addresses two classes of problems -- scarce bandwidth and high latency," says Joe Skorupa, a Gartner research analyst. "This often reduces the average number of bits transmitted by more than 90 per cent [for data patterns that have been seen before]. However, this can be misleading, since new data patterns are reduced by around 50 per cent, and compressed or encrypted files see no reduction."
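Skorupa's point that compressed or encrypted files see no reduction follows from the statistics of the data itself, and is easy to demonstrate. The sketch below uses Python's standard `zlib` compressor on two payloads: a repetitive one (standing in for chatty protocol traffic with familiar patterns) and a random one (standing in for encrypted data). It is an illustration of the general principle, not a model of any vendor's optimizer.

```python
import os
import zlib

# A repetitive payload -- e.g. near-identical protocol headers -- compresses well...
text = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n" * 200
# ...but encrypted data is statistically random, so compression gains nothing.
encrypted_like = os.urandom(len(text))

text_ratio = len(zlib.compress(text)) / len(text)
random_ratio = len(zlib.compress(encrypted_like)) / len(encrypted_like)

print(f"repetitive payload shrinks to {text_ratio:.0%} of original size")
print(f"random (encrypted-like) payload stays at {random_ratio:.0%} of original size")
```

The repetitive payload shrinks to a few per cent of its original size, while the random payload stays essentially the same size (zlib's framing overhead can even make it slightly larger).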
Skorupa continues: "[Providers] add QoS features to deal with bursts so high-priority traffic can get through. Yet, even in these cases, link loads that consistently top 65 per cent to 75 per cent cause problems that typically require bandwidth upgrades." He says this is because networks running at an average of 65 per cent to 75 per cent of capacity often see short bursts that go well beyond that level, and with no spare capacity to absorb the bursts, response times become unpredictable.
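The sharp degradation Skorupa describes past roughly 70 per cent load matches the textbook M/M/1 queueing result, where mean delay grows in proportion to 1/(1 - utilization). This is a deliberate simplification -- real WAN traffic is far burstier than the model assumes -- but it shows why the curve steepens so quickly:

```python
def delay_multiplier(utilization: float) -> float:
    """Mean queueing-delay factor for an M/M/1 queue: 1 / (1 - rho).

    A textbook simplification -- real WAN traffic is burstier than the
    model assumes -- but it illustrates why response times degrade
    sharply once average load passes roughly 70 per cent.
    """
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    return 1 / (1 - utilization)

for rho in (0.50, 0.65, 0.75, 0.90):
    print(f"{rho:.0%} average load -> delay multiplier x{delay_multiplier(rho):.1f}")
```

Delay doubles at 50 per cent load, quadruples at 75 per cent, and is tenfold at 90 per cent, which is why sustained loads in the 65-75 per cent range leave no headroom for bursts.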
WAN optimization to the rescue
Specialized appliances from companies such as Riverbed Technology and Citrix Systems address issues such as unresponsive apps, slow transmissions and network congestion.
Managed application providers such as AT&T and BT Global Services perform the same duties for a fee, bundled with the carrier service for the network pipeline that connects, for example, remote offices and retail stores.
Interestingly, there are dramatically different reasons why large companies would choose to optimize WAN traffic, depending partly on their markets, infrastructure and security concerns.
For example, at Solutia, a chemical manufacturer in St. Louis with about 6,000 employees and annual revenue of about US$4 billion, application performance problems grew out of rapid expansion and ongoing server and data center consolidation efforts. Gaining more network capacity meant what looked remarkably like start-up costs -- big upfront payments for new branch offices to access home office services -- a peculiar concept for a 100-year-old company.
"We were getting to the point where people were complaining, and we were seeing more and more dropped packets on the network," says Harold Byun, a senior product manager at Solutia. "Adding a T1 to a branch office with 700 people costs $1,000 per line. It was a better option for us to push more data through the pipe but use WAN optimization to do it, which compresses data at a rate of 59 per cent to 75 per cent."
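Byun's numbers imply a simple trade-off: a compression rate of r makes a link behave as if it had 1/(1 - r) times its raw capacity. The arithmetic below applies his quoted 59-75 per cent range to a standard 1.544 Mbit/s T1; it is illustrative back-of-the-envelope math, not Solutia's actual cost model.

```python
def effective_capacity_mbps(link_mbps: float, reduction: float) -> float:
    """Effective throughput of a link when payloads shrink by `reduction`.

    Back-of-the-envelope illustration only, not Solutia's actual model:
    a 59% reduction makes a 1.544 Mbit/s T1 carry the same payload as
    a ~3.8 Mbit/s raw link would.
    """
    return link_mbps / (1 - reduction)

T1_MBPS = 1.544  # standard T1 line rate
for r in (0.59, 0.75):
    eff = effective_capacity_mbps(T1_MBPS, r)
    print(f"{r:.0%} reduction -> T1 carries payload like a {eff:.2f} Mbit/s link")
```

At the top of the quoted range, one existing T1 does the work of roughly four, which is the economic case against buying more $1,000 lines.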
Solutia is mostly optimizing HTTP traffic for its SAP deployment, although it's also using de-duplication techniques for file transfers, where data is cached for files that were transmitted in a previous session. De-duplication is the same technology that companies such as NetApp use when making backups of virtualized servers to speed operations and use less storage.
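The caching behavior described above can be sketched in a few lines: split data into chunks, hash each chunk, and send only a short reference when the far side has seen that chunk before. This is a minimal sketch with fixed-size chunks and an in-memory cache; production appliances use content-defined chunking and persistent stores, and the function and constant names here are hypothetical.

```python
import hashlib
import os

CHUNK = 4096  # fixed-size chunks for simplicity; real appliances use content-defined chunking

def dedupe_send(data: bytes, cache: set[bytes]) -> int:
    """Return the number of bytes actually put on the wire: a full chunk
    plus its hash on a cache miss, or just a 16-byte reference on a hit."""
    sent = 0
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).digest()[:16]
        if digest in cache:
            sent += len(digest)                    # peer already has it: send a reference
        else:
            cache.add(digest)
            sent += len(chunk) + len(digest)       # send the chunk plus its hash
    return sent

cache: set[bytes] = set()
payload = os.urandom(1_000_000)                    # a 1 MB file of unique data
first = dedupe_send(payload, cache)                # cold cache: roughly full size
second = dedupe_send(payload, cache)               # warm cache: hash references only
print(f"first transfer: {first} bytes, repeat transfer: {second} bytes")
```

The repeat transfer shrinks to a few kilobytes of hash references, which is why re-sending a previously seen file can achieve the 90-per-cent-plus reductions Skorupa cites.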