Facebook has launched a fresh drive to root out race hate, extremism and fake news from its platform.
The social media giant has published new guidelines on the standards it expects from its users and has urged people to "share responsibly".
In an exclusive interview with Sky News, one of Facebook's top executives has admitted the company's enforcement of its policies is not perfect, but insists it has the interests of its users at heart.
Siobhan Cummiskey, Facebook's head of policy for Europe, the Middle East and Africa, said: "We absolutely do have the interests of the community of people who use Facebook at heart, and safety is really important to us, and that's really why we are publishing this new set of community standards."
Facebook is also publishing, for the first time, a copy of the guidelines its 7,500 content reviewers use to help weigh up whether a post violates the company's policies and should be removed.
Ms Cummiskey said Facebook was in the process of recruiting more people to help review content.
"It's very important that we use a combination of technology, human reviewers and the flagging of problem content in order to remove posts that violate our community standards," she said.
"We use automation in order to route the various reports that we receive every week. We receive millions of reports every week, and automation helps us to route those reports to the right content reviewer.
"Technology is therefore really helping us here. In the context of child exploitation imagery, we use technology in order to stop the re-upload of known child exploitation images.
"Technology is also helping to counter terrorism. Ninety-nine percent of terrorist content is removed before it is ever flagged by our community of users."
The new community standards, posted on Facebook's platform from Tuesday, highlight the firm's determination to act on unacceptable content, but are also an admission that the organisation needs to improve.
Facebook states: "Our policies are only as good as the strength and accuracy of our enforcement and our enforcement isn't perfect. We make mistakes because our processes involve people and people are not infallible."
Facebook has been criticised heavily in recent weeks, with the company's founder Mark Zuckerberg forced to apologise to US politicians over the Cambridge Analytica data mining scandal.
The company has also been accused by a number of governments of not doing enough to tackle terrorist material online.
On Sunday, Health Secretary Jeremy Hunt gave the company until the end of this month to come up with a more robust system for protecting children on Facebook.
Ms Cummiskey said she welcomed the opportunity to work with Mr Hunt, and the UK government more generally, on improving safety.
"Safety is extremely important to us and I think we're only getting better," she said.
"There's always more that we can be doing in this space and that's really what today is all about. It's about being very clear that harmful content has no place on Facebook at the same time we are trying to create a platform for all ideas."