Will Privacy Act changes have a chilling effect on cyber security research?

Security through obscurity doesn’t work, Melbourne Uni researchers note

Attorney-General George Brandis has yet to reveal the details of the amendments he will seek to make to the Privacy Act that will criminalise the re-identification of de-identified datasets released by government departments and agencies.

Brandis announced yesterday that the government would make it an offence “to counsel, procure, facilitate, or encourage anyone to do this, and to publish or communicate any re-identified dataset.”

His announcement came ahead of the Department of Health revealing today that elements of some datasets it released last month could potentially be re-identified.

The 1 August release included Medicare data from 1984 and PBS data from 2003 to 2014. In total, it included data relating to some 3 million Australians and services provided by doctors, pathologists, diagnostic imaging and allied health professionals as well as details of subsidised scripts.

A team of Melbourne University researchers successfully re-identified service provider ID numbers.

The researchers notified the department on 12 September. The department immediately pulled the dataset from the data.gov.au portal and launched an investigation into the incident, including engaging with the researchers in an attempt to understand the flaws in the de-identification process. The researchers have praised the department for its response.

Although noting that the details of the Privacy Act amendments are not yet public, advocacy organisation Digital Rights Watch has expressed alarm over Brandis’ comments.

A statement from its chair, Tim Singleton Norton, said: “The specific wording of ‘counsel, procure, facilitate or encourage’ will need to be framed carefully to exclude innocent acts, such as rigorous penetration testing of encryption software. Likewise, the whole area of de-identification research, such as that undertaken by the CSIRO, could be jeopardised through heavy-handed legislation.”

“It’s a good thing to encourage Australian cyber security researchers, because what we’re talking about are mathematical facts about whether something was secure or not,” Dr Vanessa Teague, a senior lecturer at the University of Melbourne’s Department of Computing and Information Systems, told Computerworld Australia.

Teague was one of the researchers who unearthed the problems with the dataset (she also previously discovered a vulnerability in NSW’s iVote platform).

“It’s a mathematical fact that in this particular case there was a problem,” Teague said. “And it’s better to encourage Australian researchers to find out those mathematical facts and tell the government and the public so that we can improve techniques in the future, because nasty people who take advantage of vulnerabilities are not going to be any less likely to do so as a result of Australian privacy law.”

In their article, the researchers said it was a positive move by the department to publish, alongside the dataset, details of the work it had done to protect the data.

“Security through obscurity doesn’t work – keeping the algorithm secret wouldn’t have made the encryption secure, it just would have taken longer for security researchers to identify the problem. It is much better for such problems to be found and addressed than to remain unnoticed,” the article by Dr Chris Culnane, Dr Benjamin Rubinstein and Teague said.
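
The researchers have not published the specific weakness they found, and the department’s encoding scheme is not public. Purely to illustrate their point, the sketch below uses an invented, deliberately weak scheme (provider IDs XORed with a short secret key) to show why a deterministic encoding can be reversed whether or not its algorithm is kept secret; every ID, key and function name here is hypothetical:

# Illustration only: an invented, deliberately weak "de-identification" scheme.
# It is NOT the method used by the Department of Health, whose details have not
# been published; it simply shows why algorithm secrecy alone is no defence.

SECRET_KEY = 0x2F9A  # a 16-bit key: only 65,536 possibilities to try

def weak_encrypt(provider_id: int) -> int:
    """Deterministic, keyed transformation of a provider ID (toy example)."""
    return provider_id ^ SECRET_KEY

# The published dataset would contain only the encoded IDs.
published = [weak_encrypt(pid) for pid in (104233, 205817, 377442)]

# An analyst who knows a single real (id, encoded) pair -- for example their
# own provider number -- recovers the key with one XOR ...
recovered_key = 104233 ^ published[0]

# ... and with no known pair at all, a brute-force search over every 16-bit key
# keeps the candidates that decode to plausible-looking provider numbers.
candidates = [k for k in range(2 ** 16)
              if all(0 < (enc ^ k) < 1_000_000 for enc in published)]

print(recovered_key == SECRET_KEY)                  # True
print(SECRET_KEY in candidates)                     # True: brute force finds it too
print([enc ^ recovered_key for enc in published])   # the original IDs restored

In a scheme this weak, keeping the algorithm confidential only delays the search, which is precisely the researchers’ point about obscurity.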

The trio called for details of the protections applied to sensitive datasets to be released well in advance of the datasets themselves, so that researchers can assess their effectiveness.
