‘Biometric Mirror’ reflects AI’s worrying potential

'You seem irresponsible. I'm telling your insurer'

Researchers at the University of Melbourne have designed a mirror that uses AI to detect an individual’s character traits based solely on their face. And it’s powerful enough to gauge their levels of happiness, introversion and aggressiveness.

At least that’s how it is presented. What the mirror really does is compare an image of the onlooker’s face against a database of other faces that have been rated by crowd-sourced assessors on 14 characteristics.

“As such the algorithm is correct, it’s accurate. But the information it feeds back is not accurate because it’s based on subjective information,” explains Dr Niels Wouters from the university’s Microsoft Research Centre for Social Natural User Interfaces (SocialNUI).

The longer a person stands in front of the mirror, the more personal the traits become, beginning with gender, age and ethnicity and eventually extending to scores for ‘weirdness’ and emotional stability.
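Conceptually, the system Wouters describes is a lookup rather than genuine insight: features extracted from the onlooker’s photo are matched against the crowd-rated faces, and the subjective ratings attached to the closest matches are averaged and fed back, with more personal traits revealed the longer the person stays in view. The sketch below illustrates that idea in Python; the embeddings, trait names, timings and helper functions are hypothetical stand-ins, not the project’s actual code.

```python
import math
from dataclasses import dataclass

# Hypothetical crowd-sourced database: each entry pairs a face "embedding"
# (here just a short feature vector) with average ratings from human judges.
@dataclass
class RatedFace:
    embedding: list[float]      # precomputed face features (assumed given)
    traits: dict[str, float]    # crowd-sourced perceived-trait scores, 0-1

DATABASE = [
    RatedFace([0.1, 0.4, 0.9], {"happiness": 0.8, "introversion": 0.3, "weirdness": 0.2}),
    RatedFace([0.7, 0.2, 0.1], {"happiness": 0.4, "introversion": 0.7, "weirdness": 0.6}),
    RatedFace([0.5, 0.5, 0.5], {"happiness": 0.6, "introversion": 0.5, "weirdness": 0.4}),
]

def perceived_traits(query: list[float], k: int = 2) -> dict[str, float]:
    """Average the crowd-sourced ratings of the k most similar faces.

    The lookup itself is exact, but the numbers it averages are subjective
    human judgements, so the output inherits their bias.
    """
    def distance(face: RatedFace) -> float:
        return math.dist(query, face.embedding)

    nearest = sorted(DATABASE, key=distance)[:k]
    return {t: sum(f.traits[t] for f in nearest) / k for t in nearest[0].traits}

# Progressive disclosure, as described for the mirror: the longer someone
# stands in front of it, the more personal the reported traits become.
DISCLOSURE_ORDER = ["happiness", "introversion", "weirdness"]

def feedback(query: list[float], seconds_in_front: int) -> dict[str, float]:
    scores = perceived_traits(query)
    revealed = DISCLOSURE_ORDER[: 1 + seconds_in_front // 5]
    return {t: scores[t] for t in revealed}

if __name__ == "__main__":
    print(feedback([0.2, 0.4, 0.8], seconds_in_front=12))
```

The point the researchers make is visible in the sketch: the nearest-neighbour arithmetic is correct, but the ratings it averages are opinions, so any “accuracy” belongs to the lookup, not to the judgement it reports back.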

The real research lies in exploring people’s reactions to the system.

“The aim is to investigate the attitudes that emerge as people are presented with different perspectives on their own, anonymised biometric data, distilled from a single photograph of their face. It sheds light on the specific data that people oppose and approve, the sentiments it evokes, and the underlying reasoning,” Wouters says.

At the end of their assessment, users are presented with a scenario. For example:

You look 34 years old and attractive. But you also seem very emotionally unstable. Imagine that I send this to your employer and they decide to exclude you from management opportunities.

“It shows users how easy it is to implement AI that discriminates in unethical or problematic ways which could have societal consequences. By encouraging debate on privacy and mass-surveillance, we hope to contribute to a better understanding of the ethics behind AI,” Wouters added.

Although the Biometric Mirror is more a thought-provoker than a useful tool – the project was developed in collaboration with Science Gallery Melbourne and will form part of an exhibition there later this year – similar systems are already in use in retail and advertising settings.

In 2016 Val Morgan Outdoor unveiled “Australia and New Zealand’s most intelligent out of home audience measurement system”, which captures people’s faces and uses machine learning to predict age and gender “and can even report on viewers’ moods”.

The ‘Smartscreen Network’ throughout Westfield shopping centres uses cameras to capture the age, gender and ‘mood’ of shoppers.

In 2013, UK supermarket giant Tesco embarked on a trial that purported to use facial recognition to capture demographic data and customise digital display advertising at point-of-sale terminals. The trial was later dropped following customer backlash.

AI can already determine age, gender and race from footage and images, and there is increasing concern it could be used to estimate an individual’s sexuality and even political leaning with fair accuracy.

The risks and social impact of advancing technology are the subject of a three-year project by the Australian Human Rights Commission, which launched yesterday.

“While collecting personal information about your shopping preferences to tailor an individual service may seem harmless, capturing this information without consent makes it impossible to know if a prediction is based on correct data,” Wouters said.

“The use of AI is a slippery slope that extends beyond the realm of shopping and advertising. Imagine having no control over an algorithm that wrongfully considers you unfit for management positions, ineligible for university degrees, or shares your photo publicly without your consent,” he added.

The Biometric Mirror will be part of the Perfection exhibition at Science Gallery Melbourne from September 12 to November 3.
