However, he responded, "But I just don't agree that that is the experience that most people want and that's not the experience we are trying to deliver".
While moderators have the option to delete, mark as disturbing, or ignore content when reviewing it, the reporter discovered that content flagged as disturbing remains on the site, with only limits on who can view it.
The trainer tells moderators such content would be left untouched because "it implies a lot, but you have to jump through a lot of hoops to get there".
"It's the really extreme, really unsafe form of content that attracts the most highly engaged people on the platform".
Chief operating officer Sheryl Sandberg said in January that Facebook had to do better to stem the spread of hate speech and attempts to manipulate voters via the social network. "It's all about money at the end of the day".
However, the group's chief executive Lorraine Higgins said today: "Based on the revelations from the Channel 4 Dispatches documentary last evening, we have chosen to suspend our partnership with Facebook until further notice".
At Cpl Resources, the company used by Facebook for its moderators since 2010, the undercover reporter is repeatedly told that graphic and violent content is allowed to remain on the site - as long as those posting it do not include a description that endorses the image. Facebook did eventually ban the group in March.
A journalist from British broadcaster Channel 4 went undercover as a Facebook moderator and found a stream of toxic content that the company was failing to delete.
Being "shielded" means a page is under Facebook's "shielded review", according to the TV report. This gives popular pages protected status, meaning people who work directly for Facebook decide what to do with them rather than third-party contractors. Moderators are told: "Don't worry too much about deleting their stuff because those pages are shielded, so if you delete a video or whatever, you haven't deleted Tommy Robinson's video". Facebook's Allan framed this as a discussion about political speech: "And that debate can be entirely legitimate". When pressed about whether it constituted hate speech, he said it's "right on that line". Creating a safe environment where people from all over the world can share and connect is core to Facebook's long-term success.
Despite the statements from Facebook, other questionable content remains on the platform, as shown in the documentary, including a boy being beaten by a grown man; a meme showing a girl whose head is dipped underwater with the caption "when your daughter's first crush is a little negro boy"; and a comment aimed at Muslim immigrants, saying, "f**k off back to your own countries".
In response to the documentary, which airs tonight at 9 PM in the UK, Facebook admitted in a post from Global Policy Management VP Monika Bickert that it made "mistakes", and that "we have been investigating exactly what happened so we can prevent these issues from happening again".