What's offensive? Facebook's new standards clarify what you can post
When does hate speech or nudity become unacceptable? With updates made to Facebook's Community Standards page, users may be more certain about whether the content they are posting is "appropriate."
The changes attempt to clarify what Facebook considers hate speech, bullying, promotion of suicide, criminal activity, sexual violence and exploitation, nudity, and graphic content.
Facebook users have always been able to report content that they find offensive, harmful, or illegal. However, the previous Community Standards page gave little detail as to what exactly qualified as offensive material. Now, the page has been revamped to include four categories of standards:
1. Keeping you safe
2. Encouraging respectful behavior
3. Keeping your account and personal information secure
4. Protecting your intellectual property
Facebook’s clarifications come at a time when what you share, and who you share it with, can cross personal and sometimes legal boundaries. Websites are finding it challenging to regulate content in a way that does not infringe on users’ freedom of speech while still ensuring that other users are not harmed by that freedom.
According to the company, the updates to the Community Standards page are more cosmetic than transformative. The standards describe how users are expected to behave, and are separate from Facebook’s legal terms.
"We're not changing anything about the policies," said Monika Bickert, Facebook's head of global policy management, to the Washington Post. "We're just trying to explain what we do more clearly."
Some of the standards are in place to protect users’ security and identity, such as their account information and intellectual property. Others, however, provide guidelines that solidify Facebook’s approach to potentially offensive material.
In the category of “Keeping you safe,” Facebook explains its approach to various safety matters, such as direct threats, bullying, harassment, criminal activity, attacks on public figures, and sexual exploitation. It also clearly explains its intolerance for “Dangerous Organizations,” stating it will “remove content that expresses support for groups that are involved in the violent or criminal behavior” of terrorist activity or organized criminal activity. The category is also used to address self-injury, part of Facebook’s new attempt to provide suicide prevention services.
Facebook also ventures into the realm of regulating respectful behavior, laying down guidelines for nudity and graphic content. The website explains that its efforts to limit nudity are out of consideration for others. While Facebook acknowledges that nudity is not always inappropriate, it says policies are necessary in order to respond efficiently to reports of offensive material.
“People sometimes share content containing nudity for reasons like awareness campaigns or artistic projects,” Facebook states on its Community Standards page. “We restrict the display of nudity because some audiences within our global community may be sensitive to this type of content – particularly because of their cultural background or age.”
Google also tried to address nudity and pornographic material on its blogging platform, but rescinded a partial ban after users complained. Under the policy, which would have gone into effect after March 23, users would have had to make blogs containing graphic nudity private or risk Google taking them down. Google intended to allow some nudity, as long as it offered a “substantial public benefit” artistically, scientifically, or educationally, but pushback from users forced the company to revert to its old policies.
“We’ve had a ton of feedback, in particular about the introduction of a retroactive change (some people have had accounts for 10+ years), but also about the negative impact on individuals who post sexually explicit content to express their identities,” Jessica Pelegio, a Google social product support manager, wrote online.
The Family Online Safety Institute, one of five organizations that make up Facebook’s safety advisory board, welcomed the latest changes.
"I think it's great that Facebook has revamped its community standards page to make it both more readable and accessible," FOSI chief executive Stephen Balkam told the BBC. "I wish more social media sites and apps would follow suit."