Facebook as a Social Media Platform

Facebook has been a pioneer of social media for years and has repeatedly changed its terms of service to keep pace with the constantly changing social media landscape.

Throughout the years, Facebook has expanded its social media presence and become a technology giant. Although the app has brought entertainment and unity to millions of Facebook users, the platform has also become more controversial for its content regulation.

Some information posted by users may be harmful to certain audiences, so it is important to know the stance Facebook takes in regulating content removal. In this article, we will go over Facebook’s community standards, including its rules on harassment, intellectual property, and violence and criminal behavior.

Facebook Rules on Harassment

Like all platforms, Facebook does not condone bullying or harassment. Facebook writes in its policy rationale that they “distinguish between public figures and private individuals because we want to allow discussion, which often includes critical commentary of people who are featured in the news or who have a large public audience. For public figures, we remove attacks that are severe as well as certain attacks where the public figure is directly tagged in the post or comment.”

Facebook applies different protocols for private individuals, writing that they “remove content that’s meant to degrade or shame and have more heightened protection for users between the ages of 13 and 18.”

The list of actions that Facebook classifies as harassment includes:

  • Repeated contact with someone in a manner that is unwanted or sexually harassing or directed at a large number of individuals
  • Targeting anyone maliciously by making statements with intent to bully or harass
  • Calling for self-injury or suicide of a specific person, or group of people
  • Attacks through derogatory terms related to sexual activity
  • Posting content about a violent tragedy, or victims of violent tragedies
  • Acting/pretending to be a victim of an event
  • Threatening to release an individual’s private phone number, residential address, or email address
  • Creating Pages or Groups that are dedicated to attacking individuals

Facebook Harassment Controversy

Facebook has faced its fair share of scrutiny over how it handles harassment. State attorneys general from multiple states wrote in a letter that they regularly encountered people facing online intimidation and harassment on Facebook, and suggested allowing third-party audits of hate content as a possible solution. According to a 2017 survey from the Pew Research Center, more than 40 percent of Americans have experienced some form of online harassment, and more than three-quarters of those incidents were reported to have occurred on Facebook.

Facebook removed one of its largest public groups, Official QAnon, a conspiracy theory group, for violating the company’s policies on misinformation, bullying, hate speech, and harassment.

Facebook’s Rules on Intellectual Property

Facebook has rules on intellectual property rights. Under Facebook’s Terms of Service and Community Standards, you can only post content to Facebook that doesn’t violate someone else’s intellectual property rights. The platform advises that the best way to make sure what users post to Facebook doesn’t violate copyright law is to post content they have created themselves.

On Facebook, it’s possible to infringe copyright even if you:

  • Bought or downloaded the content
  • Recorded the content onto your own recording device
  • Gave credit to the copyright owner
  • Included a disclaimer that you didn’t intend to infringe copyright
  • Modified the work or added your own original material to it
  • Found the content available on the internet
  • Think the use is a fair use

Users own all of the content and information that they post on Facebook and control how it is shared through Facebook’s privacy and application settings. Facebook’s terms of service do not allow people to post content that violates someone else’s intellectual property rights, including copyright and trademark. When Facebook receives a report from a rights holder or an authorized representative, they will remove or restrict content that engages in copyright infringement or trademark infringement.

If Facebook determines that a user has breached their terms of service they have the right to suspend or permanently disable user access. Facebook can also suspend or disable an account that repeatedly infringes other people’s intellectual property rights. When this happens Facebook will inform the user and explain any options to request a review of the issue.

Copyright Claiming Photos/Videos On Facebook

Facebook has given certain partners the power to claim ownership over images and then moderate where those images appear across the Facebook platform. Dave Axelgard, product manager of the creator and publisher experience at Facebook, said: “We want to make sure that we understand the use case very, very well from that set of trusted partners before we expand it out because, as you can imagine, a tool like this is a pretty sensitive one and a pretty powerful one, and we want to make sure that we have guardrails in place to ensure that people are able to use it safely and properly.”

To claim their copyright, a rights holder uploads a file containing the image’s metadata to Facebook’s Rights Manager, specifies where the copyright applies, and can exclude certain territories. After a rights holder does this, Facebook will process the image and monitor where it shows up.

Notable Facebook Copyright Situations:

Trump Gets Copy Strike

Like other social media platforms, Facebook has removed videos posted on President Trump’s account after receiving copyright complaints from the owners. One video featuring two toddlers, which had been turned into an internet meme and used by the Trump campaign, was taken down after one of the children’s parents lodged a copyright claim. It had more than four million views by the time Facebook removed it. Jukin Media, a third-party company that often acquires the rights to viral videos from their creators, said that “neither the video owner nor Jukin Media gave the President permission to post the video, and after our review, we believe that his unauthorized usage of the content is a clear example of copyright infringement without valid fair use or other defense.”

Facebook Gaming and Copyrighted Music

Facebook Gaming allows its partnered streamers to play copyrighted, popular music in the background of their live streams. The rules permit music only as an addition to a live stream, not as its focus. A spokesperson for Facebook Gaming said: “Music played during a gaming broadcast must be a background element, not be the primary focus of the stream. A streamer’s voice and/or gameplay audio should be in the foreground. This also applies to clips made from a live stream, and the VOD version of live streams, but does not extend to separately edited and uploaded VOD content.”

Violence and Criminal Activity

If a third-party user wants content removed because it expresses clearly violent or criminal intent, they can request that Facebook take down the content. A lot of things can be posted on Facebook, but according to the platform’s guidelines, Facebook removes “language that incites or facilitates serious violence.” The guidelines also say: “We remove content, disable accounts, and work with law enforcement when we believe there is genuine risk of physical harm or direct threats to public safety.” Facebook also takes into consideration the manner and context in which the content is expressed before making a decision. Enforcement happens through user reports as well as Facebook content reviewers.

Hate Groups or Individuals

One may also want an individual’s or a group’s entire Facebook presence removed due to the real-life hate or violence they provoke. Facebook states in its guidelines that it does not allow users who proclaim a violent mission to have a presence on the platform. Specific activities Facebook outlines include “terrorist activity,” “organized hate,” “mass murder,” “human trafficking,” and “organized violence.” In short, this part of the Facebook community standards shows that Facebook does not tolerate individual users or groups associated with the activity mentioned above.

Unauthorized Drugs and Other Goods

A third-party user may also want content promoting or selling unauthorized drugs to be removed from Facebook. Facebook does not tolerate this type of activity according to its guidelines, and it will take down posts from “individuals, manufacturers, and retailers” who attempt to “purchase, sell, or trade non-medical drugs, pharmaceutical drugs, and marijuana.” The same applies to firearms, firearm parts, ammunition, and other firearm-related objects being bought and sold through the platform, as well as other potentially dangerous items.

Fraudulent Activity

Finally, a user may want content removed if an individual is using the platform to impersonate another person for their own gain or to the detriment of the person being impersonated. More specifically, Facebook highlights that it does not tolerate individuals who conduct fraudulent activity, especially when it is to “deceive people,” “gain an unfair advantage,” or “deprive another of money, property, or legal right.” This is important for protecting the integrity of the general public and for shielding users from malicious attempts at deception.

There are many more reasons why a third-party user may want content removed from Facebook. This is only an introduction to the Facebook community standards, which also include sections on safety, fake news, integrity and authenticity, intellectual property, and more.
