Tuesday, 5 December 2017

Google says 10,000 staff will moderate content on platforms including YouTube next year



Google says it will have more than 10,000 members of staff monitoring content on platforms including YouTube next year.
 
That figure covers teams across Google, taking in not only content reviewers but also the company's engineers, lawyers and operations teams.
 
 
YouTube has been criticised for failing to adequately safeguard children and for allowing extremist content, including Islamic terror-related and white supremacist videos, to be shared.


Hundreds of accounts which had posted lewd comments beneath benign videos, such as content children had uploaded of themselves performing gymnastics, have also been suspended.


Although Google, which is YouTube's parent company, employs machine learning algorithms to automatically flag videos which may breach its rules, the ultimate decision to remove content is made by humans.


In a statement, YouTube's chief executive, Susan Wojcicki, said the company had reviewed almost two million videos and removed 150,000 since June.



BBC News.