How Facebook handles inappropriate content
Ensuring Facebook's community of more than 900 million users abides by the company's user policies is a task that requires hundreds of employees.
Based in Menlo Park, Calif.; Austin, Texas; Dublin, Ireland; and Hyderabad, India, these employees field user reports of inappropriate posts around the clock. Yesterday, Facebook revealed details about how they get the job done.
Reports of inappropriate content, which users can submit with just a couple of clicks, are directed to one of four support teams.
An Abusive Content Team handles spam and sexually explicit content. Meanwhile, a Safety Team handles threats of vandalism, graphic violence, credible threats of violence and illegal drug use. A Hate and Harassment Team handles, well, reports of hate speech and harassment. The team that handles hacked and imposter accounts is called the Access Team.
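The routing described above amounts to a simple mapping from report category to support team. As a rough sketch (the category names and function are purely illustrative, not Facebook's actual code), it might look like this:

```python
# Hypothetical model of the report routing described above.
# Category names and team assignments follow the article; none of
# these identifiers come from Facebook's real systems.

TEAM_FOR_CATEGORY = {
    "spam": "Abusive Content Team",
    "sexually_explicit": "Abusive Content Team",
    "vandalism": "Safety Team",
    "graphic_violence": "Safety Team",
    "violent_threat": "Safety Team",
    "drug_use": "Safety Team",
    "hate_speech": "Hate and Harassment Team",
    "harassment": "Hate and Harassment Team",
    "hacked_account": "Access Team",
    "imposter_account": "Access Team",
}

def route_report(category: str) -> str:
    """Return the support team responsible for a given report category."""
    try:
        return TEAM_FOR_CATEGORY[category]
    except KeyError:
        raise ValueError(f"unknown report category: {category!r}")
```

A real triage pipeline would be far more involved (prioritization, deduplication, escalation), but the four-way split is the core idea the article describes.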
If found to be in violation of Facebook's policies, Statement of Rights and Responsibilities or Community Standards, the content is removed and its publisher warned. Facebook's support teams may also block users who post inappropriate content or ban them from specific features. A separate team handles appeals.
Sometimes content on Facebook violates not just the company's policies, but the law.
One law enforcement agency, for instance, discovered photos of a man siphoning gas from a police car on the site. Others have discovered stolen property, calls for help and even live crime updates on the social network.
These crime reports haven't necessarily originated from the company's team, but Facebook does say it will share reports with law enforcement "when we have a good faith belief it is necessary to prevent fraud or other illegal activity, to prevent imminent bodily harm, or to protect ourselves and you from people violating our Statement of Rights and Responsibilities."