WAN optimization: Better than a 'real' upgrade?
- 11 August, 2008 08:43
WAN optimization in a large enterprise is a nascent and headline-grabbing market. Still, IT managers are faced with the burning question: Is it really worth the investment?
The technology uses techniques such as compression, caching and de-duplication to optimize traffic, without actually making the underlying network run faster. Whether it's a panacea for network ailments is open for debate, and analysts are by turns optimistic and mildly skeptical.
"WAN optimization addresses two classes of problems -- scarce bandwidth and high latency," says Joe Skorupa, a Gartner research analyst. "This often reduces the average number of bits transmitted by more than 90 per cent [for data patterns that have been seen before]. However, this can be misleading, since new data patterns are reduced by around 50 per cent, and compressed or encrypted files see no reduction."
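Skorupa's figures are easy to reproduce in miniature with any general-purpose compressor. A minimal Python sketch -- using the standard zlib module, not any vendor's actual algorithm -- shows why repetitive, previously seen traffic shrinks dramatically while encrypted or already-compressed data sees essentially no reduction:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Fraction of bytes saved by DEFLATE compression (0.0 means no savings)."""
    compressed = zlib.compress(data, 6)
    return 1 - len(compressed) / len(data)

# Repetitive payload: stands in for data patterns the appliance has seen before.
repetitive = b"GET /app/report?id=42 HTTP/1.1\r\n" * 1000

# Random payload: stands in for already-compressed or encrypted files,
# which look like noise to a compressor.
random_like = os.urandom(len(repetitive))

print(f"repetitive: {compression_ratio(repetitive):.0%} of bytes saved")
print(f"random:     {compression_ratio(random_like):.0%} of bytes saved")
```

The repetitive payload compresses away almost entirely, while the random payload actually grows slightly once compression overhead is added -- which is why appliances typically detect and pass through encrypted or pre-compressed streams.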
Skorupa continued: "[Providers] add QoS features to deal with bursts so high-priority traffic can get through. Yet, even in these cases, link loads that consistently top 65 per cent to 75 per cent cause problems that typically require bandwidth upgrades." He said this is because networks running at an average of 65 per cent to 75 per cent capacity often have short bursts that go well beyond that level. That leads to unpredictable response times because there is no spare capacity to absorb the bursts.
WAN optimization to the rescue
Specialized appliances from companies such as Riverbed Technology and Citrix Systems address issues such as unresponsive apps, slow transmissions and network congestion.
Managed service providers such as AT&T and BT Global Services perform the same duties for a fee, bundling optimization with the carrier service for the network pipeline that connects remote offices and retail stores, for example.
Interestingly, there are dramatically different reasons why large companies would choose to optimize WAN traffic, depending partly on their markets, infrastructure and security concerns.
For example, at Solutia, a chemical manufacturer in St. Louis with about 6,000 employees and annual revenue of about US$4 billion, application performance problems stemmed from rapid expansion and ongoing server and data center consolidation. Gaining more network capacity also carried what looked remarkably like start-up costs -- big upfront payments so that new branch offices could access home-office services -- a peculiar concept for a 100-year-old company.
"We were getting to the point where people were complaining, and we were seeing more and more dropped packets on the network," says Harold Byun, a senior product manager at Solutia. "Adding a T1 to a branch office with 700 people costs $1,000 per line. It was a better option for us to push more data through the pipe but use WAN optimization to do it, which compresses data at a rate of 59 per cent to 75 per cent."
Solutia is mostly optimizing HTTP traffic for its SAP deployment, although it's also using de-duplication techniques for file transfers, where data is cached for files that were transmitted in a previous session. De-duplication is the same technology that companies such as NetApp use when making backups of virtualized servers to speed operations and use less storage.
With WAN optimization, if a business user sends a large PowerPoint file to a branch office and the recipient changes just one slide, the entire file is not retransmitted from the branch office back to the user -- only the changed data is.
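That delta behavior can be sketched with fixed-size chunking and a hash cache on the receiving appliance. Real products typically use more sophisticated variable-size chunking and byte-level deltas, so this Python example is illustrative only:

```python
import hashlib
import os

CHUNK = 4096  # fixed-size chunks; real appliances often use variable-size chunking

def dedup_send(data: bytes, receiver_cache: dict) -> int:
    """Return the number of payload bytes actually transmitted,
    updating the receiver's chunk cache as a side effect."""
    sent = 0
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in receiver_cache:
            receiver_cache[digest] = chunk  # cache miss: ship the whole chunk
            sent += len(chunk)
        # cache hit: only the 32-byte digest crosses the WAN (ignored here)
    return sent

cache = {}
original = os.urandom(1_000_000)     # ~1MB stand-in for the PowerPoint file
first = dedup_send(original, cache)  # cold cache: everything is transmitted
# Change exactly one chunk, leaving the rest of the file byte-identical.
edited = original[:CHUNK] + os.urandom(CHUNK) + original[2 * CHUNK:]
second = dedup_send(edited, cache)   # warm cache: only the new chunk moves
print(first, second)                 # 1000000 4096
```

The first transfer sends the full megabyte; the second sends a single 4KB chunk, because every other chunk is already in the far-side cache.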
In some ways, the Solutia example reveals how the term "WAN optimization" is a misnomer, because optimization often implies acceleration: if you optimize a sports car, it goes faster. Even though appliance vendors usually cite statistics -- performance gains of up to 32 times, in some cases -- the products aren't actually changing the speed of the network itself.
Yet the compression alone is often worth the investment, which typically runs just a few thousand dollars for an appliance that optimizes a 3Mbit/sec. to 5Mbit/sec. connection. (Optimizing a 100Mbit/sec. pipeline can cost well over $10,000 for just one appliance, however.) That was the case at Activision, a well-known game publisher, which has three district offices, 13 development studios (some in countries such as Japan and Australia), several sales and marketing offices ranging in size from six to 200 people each, and a corporate office.
The issue Activision faced wasn't directly related to application performance -- although the company recently conducted its first WAN optimization test from the US to London for an Oracle 11i application. Instead, the company that made Guitar Hero 3 and Call of Duty 4 was struggling with network latency.
According to Thomas Fenady, a senior IT director, the process of developing games was hampered by some harsh realities of WAN networking. With game builds ranging in size from 4GB to 12GB, employees in remote offices had to wait about eight hours to receive the latest files, sometimes watching movies or heading home for the day until they could start working on the latest version of the game.
"We blamed the problem on two issues," says Fenady. "One is just the speed of light, which we could do nothing about. Even over a 35Mbit/sec. or 45Mbit/sec. connection from Santa Monica to Dublin, we saw latency go from 100ms to 180-200ms to 250ms or higher. The other issue was TCP inefficiencies [where the nature of the Transmission Control Protocol causes the transfer speed to throttle down as congestion occurs]. TCP throttling makes a connection drop down slowly when you lose packets. With WAN optimization, those same game-builds now transfer in about 15 minutes instead of eight hours." The transfer times dropped so dramatically because WAN optimization mitigates the congestion and latency problems that trigger TCP throttling.
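The "speed of light" problem Fenady describes comes down to the fact that a single TCP stream can move at most one window of data per round trip. A back-of-the-envelope Python calculation -- assuming the classic 64KB window with no window scaling, an illustrative assumption rather than a figure from Activision -- shows why a fat pipe sits mostly idle at high latency:

```python
def window_limited_mbit_per_sec(window_bytes: int, rtt_ms: float) -> float:
    """Max throughput of one TCP stream limited by its window: window / RTT."""
    return window_bytes * 8 / (rtt_ms / 1000) / 1_000_000

WINDOW = 64 * 1024  # classic 64KB receive window, i.e. no window scaling

# The round-trip times Fenady cites for Santa Monica to Dublin.
for rtt_ms in (100, 180, 250):
    mbps = window_limited_mbit_per_sec(WINDOW, rtt_ms)
    print(f"RTT {rtt_ms}ms -> at most {mbps:.1f}Mbit/sec. per stream")
```

At 250ms, a single window-limited stream tops out near 2Mbit/sec. no matter whether the link is provisioned at 45Mbit/sec., which is one reason optimization appliances proxy TCP connections locally rather than simply relying on raw bandwidth.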
Fenady noted that although network latency wouldn't have derailed game development (because the company would have found a work-around in order to make release dates), the optimization makes the development process more fluid and less of a waiting game. He says the big challenge with optimization -- one that separates one vendor from another -- is that it's easier to optimize a slower link such as a T1, but a line that runs at 45Mbit/sec. or higher is more difficult to optimize. This is because much faster networks transmit data so rapidly that it's hard for an appliance to analyze in real time which data can be compressed, which data is encrypted and which data can be de-duplicated.
Hosted or in-house?
One decision facing IT managers is whether to use in-house WAN optimization appliances that sit on the internal side of the routers in the data center, or appliances that sit just outside of the company walls -- operated by a managed services provider such as AT&T, Orange, Vanco, BT Global Services or Verizon.
Interestingly, the exact same appliances -- such as the Riverbed Steelhead -- are often used in-house and by service providers. One advantage of service-provided optimization is a lessened management burden. According to Gartner's Skorupa, many companies choose a service provider because they can't keep pace with changes in application development techniques, ERP releases and security, for example. The WAN optimization appliance itself may need regular updates as well.
Yet a managed service isn't always the best solution. Activision decided to use an in-house Riverbed appliance because it could easily switch carriers -- depending on who has the best plan and offers the best speed for its offices.
If Activision switches to a carrier that doesn't offer optimization, its strategy doesn't have to change. The company can pick carriers by speed and price, not optimization services.
In some markets, such as health and finance, a managed provider isn't a good option for another reason: Because data is optimized in an unencrypted state, privacy and security concerns arise. But vendors such as Riverbed, Juniper Networks and Blue Coat Systems can serve as a trusted "man in the middle" for optimizing data encrypted with SSL, which is commonly used in applications with Web interfaces and other Internet traffic. They terminate the encrypted session, decrypt, optimize and then re-encrypt and forward the traffic. Skorupa said most vendors are developing this useful capability.
In the end, WAN optimization is here to stay as a long-term solution that is still proving its value, observers say. More and more enterprises are consolidating their data centers, so network speed is a critical factor. Whether in-house or managed through a service, experts say the compression techniques will only get better -- and the results will pay dividends for IT executives on a mission to run smoother operations.
John Brandon is a freelance writer and book author who worked as an IT manager for 10 years.