Sydney Uni crunches Ebola data with 1512-core HPC setup
- 10 June, 2015 15:42
The University of Sydney’s High Performance Computing (HPC) service dubbed Artemis is currently being used to crunch data about the Ebola virus.
Professor Edward Holmes told Computerworld Australia that he began the research in 2014 when the outbreak began in West Africa.
“We can sequence the genome material of the viruses and compare those genome sequences using computers. Those comparisons tell us about how the virus is spreading through populations,” he said.
“What that is telling us is the incredible speed, almost in real time, at which the virus is spreading. The HPC is allowing me to do that [research] 10 times faster. It’s a fantastic boost for my research.”
He said that as the virus spreads, it mutates, and those mutations show up in the genome sequence. However, because genomes are long strings of letters, a computer is needed to analyse the data.
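The comparison Holmes describes can be illustrated with a minimal sketch: given two aligned genome sequences, count the positions where the letters differ. The sequences and function name below are hypothetical examples, not real Ebola data or the actual analysis pipeline used at the university.

```python
# Minimal sketch: counting point mutations (substitutions) between two
# aligned genome sequences. The sequences are short made-up examples.

def count_mutations(seq_a: str, seq_b: str) -> int:
    """Count positions where two equal-length aligned sequences differ."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

# Two hypothetical viral genome fragments sampled at different times.
sample_2014 = "ATGGCGTACGTTAGC"
sample_2015 = "ATGGCTTACGTTGGC"

print(count_mutations(sample_2014, sample_2015))  # 2 differing positions
```

Real analyses work on genomes tens of thousands of letters long across many samples, which is why the throughput of an HPC cluster matters.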
“With the HPC on site, if we got a pandemic of influenza in Australia we could analyse in real time how it was spreading and this would inform how best to control it,” he said.
Artemis will also be used for big data projects in molecular biology, economics, mechanical engineering and oceanography.
The HPC service is free for University of Sydney researchers.
Artemis has 1512 cores of compute capacity and comprises 56 standard compute nodes, two high memory compute nodes and five GPU compute nodes.
Dell customised Artemis for the University of Sydney with a technical design tailored to its performance and capacity requirements.
Follow Hamish Barwick on Twitter: @HamishBarwick