Monday, 1 May 2017

Social media 'should be fined' for failure to tackle online hate

Social media companies should be fined for failing to remove illegal or harmful material, a committee of MPs has said.
The Home Affairs Select Committee said many websites are "shamefully far" from tackling the issue, with some putting profits before safety.

There have been a number of high-profile hate crimes broadcast on social media in recent years.

In January, a man with learning difficulties was bound, gagged and brutally punched in a video that was live streamed on Facebook.

Last month, a video was uploaded to Facebook of a man being shot dead in an unprovoked attack.

The Islamic State group has also used social media as a propaganda and recruiting tool.

Timi Ariyo, a student at Bristol University, told Sky News he was a victim of online racist abuse.

He said a friend had alerted him to a video posted on Snapchat; when he watched it, he saw a group of 10 to 15 people he had been to school with, gathered in a pub.

"They were chanting my name, making monkey noises and racial slurs," he said.

"At that point, I realised it was beyond racial banter."

A group of MPs are calling on the Government to make it illegal for social media websites not to remove harmful material.

They want fines to be introduced as punishment and would also like companies to publish quarterly reports, outlining their safeguarding strategy.
Labour MP Yvette Cooper, who chairs the Home Affairs Select Committee, said: "Social media companies' failure to deal with illegal and dangerous material is a disgrace.

"They have been asked repeatedly to come up with better systems to remove illegal material... yet repeatedly they have failed to do so.

"It is shameful.

"These are among the biggest, richest and cleverest companies in the world."

There are fears that social media has become a platform for terrorist propaganda, child abuse and racist attacks.

Researchers at Cardiff University have been studying online hate crimes and their frequency. Using algorithms, they analyse particular groups of words that are often used together to cause harm.

Dr Pete Burnap said: "When we see spikes of hate, following trigger attacks, they tend to be responding to ongoing issues.

"So one of the examples would be the ongoing 'us and them' narrative. People write 'send them home', 'get them out', following attacks like Westminster, picking up on immigration and foreign policy."
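The kind of analysis described above can be sketched in a few lines: count how often a watchlist of phrases appears in posts each day, then flag days where the count spikes well above the baseline. This is a minimal illustration only, not the Cardiff team's actual method; the phrase list, sample data, and threshold are all hypothetical assumptions.

```python
# Minimal sketch of phrase-spike detection, loosely inspired by the approach
# described above. NOT the Cardiff researchers' actual algorithm; the phrase
# watchlist, data, and threshold factor are illustrative assumptions.

TARGET_PHRASES = ["send them home", "get them out"]  # hypothetical watchlist


def count_hits(posts):
    """Count how many posts contain any watched phrase (case-insensitive)."""
    return sum(any(p in post.lower() for p in TARGET_PHRASES) for post in posts)


def spike_days(daily_posts, factor=3.0):
    """Flag days whose hit count exceeds `factor` times the mean daily count."""
    counts = [count_hits(posts) for posts in daily_posts]
    baseline = sum(counts) / len(counts)
    return [day for day, c in enumerate(counts) if c > factor * baseline]


# Illustrative data: three quiet days, then a spike on day 3.
days = [
    ["nice weather today"],
    ["match report", "holiday photos"],
    ["lunch plans"],
    ["send them home now", "Get them out!", "send them home", "angry post"],
]
print(spike_days(days))  # day 3 stands well above the baseline
```

A real system would, of course, need far more than substring matching: tokenisation, context, and human review, since the same words can appear in reporting or counter-speech.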

Sky News