The Decider [again]

I wrote a post in May about how Facebook is the new decider of what is “acceptable” content online.

I wrote then:

I do not like the idea of Facebook being the arbiter of what is and is not in good taste and what users should and should not see (for more, read The Filter Bubble). It either needs to be a neutral platform open to all users, removing only the most egregious instances of specific violence, or make clear that it wants to be a happy place of flowers and butterflies.

and

Facebook has no legal or moral obligation to admit anybody onto the site to say whatever he or she pleases, other than its founder Mark Zuckerberg’s goal of “making the world more open and connected” — something that is hard to do when deliberately closing off large portions of the world’s largest social network to those who possess unpopular views.

But this week, Facebook reversed its position on one of its more controversial content filters: beheading videos. According to news reports [http://www.bbc.co.uk/news/technology-24608499], the social network removed such content a few months ago after complaints, but those same reports now say that Facebook is allowing beheading videos once again.

The company’s justification: people turn to Facebook to share their experiences of ongoing events, and Facebook is the platform for that sharing. If people share graphic content to celebrate it, then Facebook will remove the material.

The company provided the following statement to CNET:

People turn to Facebook to share their experiences and to raise awareness about issues important to them. Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses, acts of terrorism, and other violence. When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it.

As part of our effort to combat the glorification of violence on Facebook, we are strengthening the enforcement of our policies. First, when we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence.

Second, we will consider whether the person posting the content is sharing it responsibly, such as accompanying the video or image with a warning and sharing it with an age-appropriate audience.

Based on these enhanced standards, we have re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence. For this reason, we have removed it.

Going forward, we ask that people who share graphic content for the purpose of condemning it do so in a responsible manner, carefully selecting their audience and warning them about the nature of the content so they can make an informed choice about it.

So raising awareness of issues involving graphic content: good. Celebrating graphic content: bad.