Facebook will no longer allow violent videos. Facebook Inc. is hiring 3,000 technical experts to monitor whether individuals are posting violent videos such as suicides, killings or other similar content. The company became concerned about screening violent content after two videos recently went viral: one showed a violent killing, the other a suicide.
After the two videos went viral, the social media giant faced many calls to take serious action and respond faster. The new employees will join the 4,500 people already working on Facebook content moderation. A day may come when computers or AI can screen out violent videos before they are posted, but for now, the human touch will have to do.
Mark Zuckerberg wrote, “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner whether that’s responding quickly when someone needs help or taking a post down.”
Facebook took a stance this time
Facebook not only wants to screen out violent videos; it also wants to make people feel safe. In a letter earlier this year, Zuckerberg already said that Facebook will take responsibility for its impact on its 2 billion users. Even if misinformation about civic engagement spreads through posts, memes or videos on Facebook, the company will take responsibility. This seems better than the social media giant's previous stance, in which it took neutral ground.
Last week, a US individual who was dubbed the "Facebook Killer" shot himself to death. The suicide video sparked outrage throughout the world, and the social media giant had to remove a lot of footage hours after the attack. Zuckerberg acknowledged that the company has a certain role to play in the spread of such brutal videos. The company is already working to make such content easier to report so that it can take action sooner. The 3,000 new tech specialists will also help remove hate speech and content related to child exploitation.
Companies like Google and YouTube are facing the same dilemma
According to critics, social media platforms are too slow to respond, and Zuckerberg clearly doesn't want Facebook to be seen that way. The company is working on better technology that can identify inappropriate content, though it is not yet clear how the new monitoring options will be used. The company has also made clear that it is not a media organisation in the business of spreading news. Last month it enhanced its security measures to reduce efforts to spread misinformation, and it put a complete ban on revenge porn.
Under such circumstances, Facebook is not the only company in a dilemma; YouTube is facing the same situation. Google has already hired a team of 10,000 workers to monitor and screen out hateful, violent and abusive videos. Adding 3,000 workers to monitor videos will certainly help Facebook stop objectionable content, but it is not a complete fail-safe.