There are more than 250,000 health apps on the market, but a lack of clear regulation and developers’ ignorance of existing rules could mean they are putting users at risk.
The sheer number of apps – and the widespread perception that they provide cost-effective, accessible healthcare – has led to consumers receiving delayed or unnecessary diagnoses, being recommended inappropriate treatment and being misled into making purchases, as well as a raft of privacy issues.
Researchers at the University of Sydney are hoping to minimise such apps’ potential for harm with a design guide aimed at developers, to help them meet their regulatory requirements and follow emerging best practices.
“Health app use could be harmful to consumers. For example, consumers may suffer loss of personal privacy, leading to reputational damage with subsequent financial, social, insurance or employment discrimination. Symptoms of ill-health may be exacerbated, under-diagnosed, or over-diagnosed,” said senior author Dr Quinn Grundy from the university’s Charles Perkins Centre.
“It is important for those involved in health app development, regardless of whether they focus on the commercial, technical or health-related aspects, to be aware of the potential for negative outcomes, and to work towards producing apps that deliver benefits but also avoid, or at least minimise, possible harms.”
The result is an interactive tool – in the form of a questionnaire – which brings together 29 policies produced by both governments and non-government organisations in Australia.
Created in partnership with the Australian Communications Consumer Action Network, the tool highlights legislative, industry and professional standards around: privacy, security, content, promotion and advertising, consumer finances, medical device efficacy and safety, and professional ethics.
For the purposes of forming the guidelines, the researchers focused on mental health apps.
“We chose to focus on mental health apps because digital mental health tools, including apps, are being heavily promoted by governments and the World Health Organization yet are associated with a risk of harm, including a heightened risk of stigma or discrimination associated with loss of privacy, and potential risks to consumer safety from under-treatment,” lead author Lisa Parker writes in the research study behind the guide, published in BMC Medical Informatics and Decision Making.
“People download mental health apps because they are experiencing some sort of distress. This means that they might be vulnerable to misleading advertising, stigma due to loss of privacy, loss or waste of their money, or worsening of mental health symptoms that the app may not address.”
A separate study, published last month, found that mental health apps significantly reduced people’s depressive symptoms; however, it did not consider their design principles.
While there is legislation covering the design of health-related apps in Australia, and government bodies such as the Office of the Australian Information Commissioner (OAIC) have published best-practice guides for developers, the researchers found these resources hard to locate and difficult to comprehend.
“Legislation is siloed in separate government departments. We note above that many laws depend on self-declared compliance and consumer complaints, and that some laws are poorly expressed to facilitate this,” Parker and her colleagues write.
The “fragmented nature” of regulatory oversight caused further problems, the researchers added.
“For example, consumers who experience worsening symptoms associated with the use of an app to manage stress may wonder whether complaints and concerns should be directed at consumer protection agencies, medical device agencies or app stores. This means that illegal and potentially harmful apps may persist in the public domain,” the study states.
While the guide is targeted at Australian developers, it could be adapted for other countries, the researchers said.