As Westpac Bank recently discovered, keeping a data centre running smoothly can be a complicated thing.
The bank made headlines after an air conditioning outage at its data facilities pulled down automatic teller machines (ATMs) and EFTPOS facilities for hours. At the time, data centre specialist and director of Adelaide-based Computer Site Solutions, Mike Lockett, said the incident was a timely reminder for Australian businesses to review their data centre’s capacity to cope with a fault.
“It’s vital data centres have additional power and cooling that can be utilised if one of the air conditioning units or power sources fail,” Lockett said in a statement. “We know from experience that many centres just don’t have this capacity and it makes them extremely vulnerable.”
According to Gartner Australia research vice-president, Phil Sargeant, Westpac’s problem could have been avoided if the bank had invested in backup cooling methods “such as liquid cooling or allowing outside air to filter through the data centre”.
An alternative method is to place the data centre in a cool climate, but machinery such as humidifiers needs to be installed to move that air around.
“You could have a data centre in a very cold climate and open the windows which is kind of a solution, but you still have to get the air around the devices. However, placing a facility in a cooler climate will save energy costs because you don’t need to use as much cooling.”
Alternative energy sources, such as wind and solar power, could also be utilised, and Sargeant says these are a big plus if the company wants to harness green energy and cut down on carbon emissions created by air conditioning. But again, these energy sources are not 100 per cent reliable.
“Let’s say you are relying on a wind turbine and suddenly that wind turbine stops because there is some mechanical failure,” he says. “Unless you have another power source then you’re dead in the water.”
An alternative to a cold climate is to take advantage of Australia’s extensive coastline, placing the data centre near a water source and using pumps to circulate cold water around the facility.
Just like the price of houses with sea views, Sargeant warns that placing a data centre by an ocean can be “exorbitant”. “If you were to utilise the water, you still have to put the mechanisms in and the pumping units, which could be very good, but what happens if those pumps fail?”
“Even with other [cooling] methods you’re still going to have failures in the movement of devices,” he says. “What you have to do is put in contingency plans for those outages.”
For example, Sargeant says he was “surprised” that Westpac did not employ a backup data centre or co-location data centre services provided by a third party.
“When you have a bank or a telecommunications company that relies on having a data centre operational all the time, you have to have disaster recovery or mechanisms where you can continue your business,” he says.
“Even if you have a power outage or an air conditioning outage at one site, the work can be quickly brought up at another site, and it is surprising to me that [for Westpac] it didn’t happen.” He adds that there is “no doubt” companies that need to be up all the time must have multiple power feeds, air conditioning and redundancy.
While Sargeant doesn’t see cooling methods changing dramatically just yet, energy prices and the Federal Government’s carbon tax proposal may upset the data centre apple cart.
The plan to impose a carbon tax from 1 July 2012 will increase electricity prices by up to $300 a year per household, according to Prime Minister Julia Gillard. Adding that cost to those already borne by data centre operators may mean cost becomes the deciding factor in which cooling methods are used.
Sargeant believes data centre managers and organisations will be forced to look at energy management in a way they haven’t in the past, such as installing energy management systems to control power and cooling and cut power requirements. Data centres will also move to new hardware that is capable of running in warmer air.
“I’ve visited some data centres in Australia and they are very cold,” he says. “Data centres can now accommodate higher temperatures and the offset is you don’t have to use as much air conditioning.”
And it’s not just Gartner Australia that is pushing for alternative cooling methods. Gartner US chief of research for infrastructure, David Cappuccio, recently said that the reason data centres use so much cooling is that the traditional method of designing data centres dates back to the mainframe era of the 1980s and ’90s.
Due to high costs, many mainframes were targeted for average utilisation in the mid-90 per cent range during production time slots.
"As a result, there was minimal variation in the operating temperature or power consumption during long periods of time,” he says.
Changing demands on mechanical and electrical systems, driven by a more varied workload mix, meant new designs were needed.
“Zones within the data centre might employ directed cold air, or even in-rack cooling to support very high density workloads with minimal disruption, or impact, on the rest of the floor.”
Hamish Barwick writes about data centres and security for Computerworld, CIO and Techworld
Follow Hamish Barwick on Twitter: @HamishBarwick
Follow Computerworld Australia on Twitter: @ComputerworldAU