Last May, Mark Zuckerberg’s company forbade the dissemination of videos with violent images after receiving numerous complaints from users about a video of a beheading shared on its network. Citing the psychological harm such content could cause, Facebook updated its policy; now it has done so again. The company will allow the publication of videos with violent content provided that users sharing them aim to express their disgust or condemn the actions shown.
Facebook has made some changes to its community rules regarding human rights abuses, terrorism, and other types of violence. As the social network’s spokesperson explained, videos will no longer be deleted simply for including violent images. Instead, the company will leave it to users to decide whether or not to view such content: a warning message will advise them that they are about to access sensitive images.
The company’s underlying policy hasn’t actually changed: “disagreeable” videos will be permitted when shared for purposes of condemnation but not for sadism, in which case they will continue to be taken down. The difference is that, from now on, Facebook will take into account not only the content of videos shared on its network but also the context in which they are spread.
The spokesperson has specified that “Facebook has long been a place where people turn to share their experiences, particularly when they’re connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events. People share videos of these events on Facebook to condemn them.”
Reporting or condemning certain content on the social network is no guarantee that it will be removed. Facebook has defended this position by arguing that, within such a diverse community of users, shared content does not always infringe community norms even if several people consider a post inappropriate.
Users will also have tools of their own: the social network aims to give them control over what they see on Facebook, allowing them to hide people, pages, or apps they consider offensive.
The video showing a woman being decapitated reappeared on Facebook after the change to the community rules on Tuesday, but the social network removed it again within 24 hours. In this case, Facebook clarified in a statement that its aim is to strengthen its fight against the glorification of violence, and that it will continue to analyze the context of other multimedia files shared on the network in order to eliminate those that “celebrate violence.”
Download Facebook on Uptodown
Android version | http://facebook.en.uptodown.com/android
iPhone version | http://facebook.en.uptodown.com/iphone