Social media firms agree to work with UK charities to set online harm boundaries
Social media giants, including Facebook-owned Instagram, have agreed to fund UK charities to make recommendations that the government hopes will speed up decisions about removing content that promotes suicide, self-harm or eating disorders on their platforms.
The development follows the latest intervention by health secretary Matt Hancock, who met with representatives from Facebook, Instagram, Twitter, Pinterest, Google and others yesterday to discuss what they’re doing to tackle a range of online harms.
“Social media companies have a duty of care to people on their sites. Just because they’re global doesn’t mean they can be irresponsible,” he said today.
“We must do everything we can to keep our children safe online so I’m pleased to update the house that as a result of yesterday’s summit, the leading global social media companies have agreed to work with experts… to speed up the identification and removal of suicide and self-harm content and create greater protections online.”
However, he failed to get any new commitments from the companies to do more to tackle anti-vaccination misinformation — despite saying last week that he would be heavily leaning on the tech giants to remove such content, warning it posed a serious risk to public health.
Giving an update on his latest social media moot in parliament this afternoon, Hancock said the companies had agreed to do more to address a range of online harms — while emphasizing there’s more for them to do, including addressing anti-vaccination misinformation.
“The rise of social media now makes it easier to spread lies about vaccination so there is a special responsibility on the social media companies to act,” he said, noting that coverage for the measles, mumps and rubella vaccination in England decreased for the fourth year in a row last year — dropping to 91%.
There has been a rise in confirmed measles cases from 259 to 966 over the same period, he added.
With no sign of an agreement from the companies to take tougher action on anti-vaccination misinformation, Hancock was left to repeat their preferred talking point to MPs, segueing into suggesting social media has the potential to be a “great force for good” on the vaccination front — i.e. if it “can help us to promote positive messages” about the public health value of vaccines.
For the two other online harm areas of focus, suicide/self-harm content and eating disorders, suicide support charity Samaritans and eating disorder charity Beat were named as the two UK organizations that would be working with the social media platforms to make recommendations for when content should and should not be taken down.
“[Social media firms will] not only financially support the Samaritans to do the work but crucially Samaritans’ suicide prevention experts will determine what is harmful and dangerous content, and the social media platforms committed to either remove it or prevent others from seeing it and help vulnerable people get the positive support they need,” said Hancock.
“This partnership marks for the first time globally a collective commitment to act, to build knowledge through research and insights — and to implement real changes that will ultimately save lives,” he added.
The Telegraph reports that the value of the financial contribution from the social media platforms to the Samaritans for the work will be “hundreds of thousands” of pounds. During questions in parliament, MPs pointed out that the amount pledged is tiny compared with the massive profits commanded by the companies. Hancock responded that it was what the Samaritans had asked for to do the work, adding: “Of course I’d be prepared to go and ask for more if more is needed.”
The minister was also pressed from the opposition benches on the timeline for results from the social media companies on tackling “the harm and dangerous fake news they host”.
“We’ve already seen some progress,” he responded — flagging a policy change announced by Instagram and Facebook back in February, following a public outcry after a report about a UK schoolgirl whose family said she killed herself after being exposed to graphic self-harm content on Instagram.
“It’s very important that we keep the pace up,” he added, saying he’ll be holding another meeting with the companies in two months to see what progress has been made.
“We’ll expect… that we’ll see further action from the social media companies. That we will have made progress in the Samaritans being able to define more clearly what the boundary is between harmful content and content which isn’t harmful.
“In each of these areas about removing harms online the challenge is to create the right boundary in the appropriate place… so that the social media companies don’t have to define what is and isn’t socially acceptable. But rather we as society do.”
In a statement following the meeting with Hancock, a spokesperson for Facebook and Instagram said: “We fully support the new initiative from the government and the Samaritans, and look forward to our ongoing work with industry to find more ways to keep people safe online.”
The company also noted that it’s been working with expert organisations, including the Samaritans, for “many years to find more ways to do that” — suggesting it’s quite comfortable playing the familiar political game of ‘more of the same’.
That said, the UK government has made tackling online harms a stated policy priority — publishing a proposal for a regulatory framework intended to address a range of content risks earlier this month, when it also kicked off a 12-week public consultation.
Though there’s clearly a long road ahead to agreeing a law that’s enforceable, let alone effective.
Hancock resisted providing MPs with any timeline for progress on the planned legislation — telling parliament “we want to genuinely consult widely”.
“This isn’t really an issue of party politics. It’s a matter of getting it right so that society decides on how we should govern the Internet, rather than the big Internet companies making those decisions for themselves,” he added.
The minister was also asked by the shadow health secretary, Jonathan Ashworth, to guarantee that the legislation will include provision for criminal sentences for executives for serious breaches of their duty of care. But Hancock failed to respond to the question.
from TechCrunch https://tcrn.ch/2VBstku