Executives from Facebook, Google, YouTube and Twitter will meet with Prime Minister Scott Morrison on Tuesday to discuss measures to curb ‘hate content’ on their platforms, it has been reported.
The Australian yesterday said the PM would be demanding the tech giants take greater responsibility for the content on their platforms, at a meeting that would also be attended by Communications Minister Mitch Fifield and Attorney-General Christian Porter.
Australian ISPs were reported in the AFR to have also been invited to the meeting, which will take place in Brisbane. Earlier this week Telstra, Vodafone and Optus blocked a number of sites they said weren’t acting to remove footage of Friday’s terror attack in Christchurch.
The Office of the Prime Minister would not confirm to Computerworld if such a meeting was planned, or who would be attending. The Department of Communications and the Arts referred questions about the meeting to the PM's office.
The reports follow growing concerns from governments in Australia and New Zealand that social media companies aren’t doing enough to stop the spread of graphic content on their platforms.
Morrison on Monday wrote to the G20, calling on leaders to “ensure social media companies implement better safeguards to ensure their platforms can’t be exploited by terrorists or to spread hate speech”.
On Tuesday, New Zealand Prime Minister Jacinda Ardern said her government would be scrutinising the role of social media in the incident.
“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published. They are the publisher. Not just the postman. There cannot be a case of all profit, no responsibility,” she said.
According to Facebook, the livestream of the gunman’s headcam was viewed fewer than 200 times during the broadcast, and about 4000 times in total before being removed. But users shared and reuploaded the video to the platform: Facebook said it had removed 1.5 million videos of the attack globally, 1.2 million of them at upload.
YouTube said on Tuesday it had removed “tens of thousands of videos and terminated hundreds of accounts created to promote or glorify the shooter” and had been working “around the clock” to rid the platform of the footage.
In the immediate aftermath, YouTube said it temporarily suspended the ability to sort or filter searches by upload date and diverted searches to authoritative news sources.
“There is much more work to do,” the company said.
On Sunrise yesterday, Morrison detailed his expectations of tech companies.
“I want the social media companies to use their technology to ensure that instantaneously, their platforms cannot be used as weapons by terrorists,” he told the program.
“If they can geo-target an ad to you based on something you’ve looked at on Facebook within half a second – it is almost like they are reading your mind – then I'm sure they have the technological capability to write algorithms that screen out this type of violent and hideous material at a moment’s notice,” Morrison said.
Communications Minister Mitch Fifield, in a statement to the AFR, said: “The time has come for those who own and manage platforms to accept a greater responsibility for how they are used. A best endeavours approach is no longer good enough.”