
A former Facebook content moderator, who is currently taking legal action against the social media platform, has described in an interview with the Irish Times the dreadful work he had to carry out in his role and the serious psychological trauma it inflicted on him.

As recently as last week, speaking at a Tech Won’t Build It event at TU Dublin, the moderator, Gray, went into some detail about the duties he had to carry out, the targets he had to meet and the terrible content he had to examine before deciding whether or not it was fit for publication on the social media platform.

He said: “You’ll be scrolling through stuff like this and making decisions and then you get a truckload of very scared people being unloaded by men with machine guns somewhere in the Middle East. They’re lining them up and a trench has been dug in the ground. And you know what’s going to happen but you have to keep watching until the shooting starts and even after to ensure that you make the right decision. This stuff comes in and, bang, it pops up on my queue as quickly as that. I’m making decisions and it’s just, bang, and the next one, bang, and the next one.”

He continued: “Then you see something and you have a question because [after consulting the implementation standards] you’re not sure. So you look to the Known Questions document and that’s another 50,000 words of clarification. And there’s another document called Operational Guidelines, which is 5,000 to 6,000 words telling you how to do your job. Then there’s all the training material: 18 PowerPoint presentations with a wall of words on every slide. So can you imagine the cognitive load, the amount of stuff that has to be going on in your head to process this content?”

In a subsequent interview with the Irish Times, Gray said there is no clarity on who devised the guidelines that determine whether disturbing content may remain published for public viewing. He said moderators were simply provided with the guidelines and told to comply with them strictly at all times. The most recent comment from Facebook on the issue came from Ellen Silver, its vice-president of operations, in 2018. She said: “Our content policy team writes these rules, drawing on their expertise in everything from counterterrorism to child sexual exploitation – and in close consultation with experts around the world.”

Gray said that the guidelines were open to interpretation, and it was all too easy to take a course of action that management would deem incorrect.

He said: “The decision making is super granular. I’ve stated to my lawyers that there were about a hundred possible decisions to make on any given piece of content while I was there. I’ve just seen a news report saying that it’s now 250. If you make the right action for the wrong reason, eg you deleted it because there was a naked man in the image but that naked man was doing something which is also illegal. One of those actions is more important than the other and you have to choose the right one. Otherwise you get it wrong and then you’re in an argument with your auditor.”

Along with the high work targets, moderators face pressure to justify their decisions in order to maintain the required 98% accuracy rate, while the supervising auditors face contrasting pressure not to give in to appeals, for the sake of their own positions. Gray said: “A typical sample that is audited would have been 200-250 tickets a month, so on average you might be flagged for one mistake a week. You spend the week trying to get those back. You’re not focused on your work, you’re focused on justifying your decisions.” He referred to a particularly disturbing photo of brutality against an infant and the debate he was forced to have with his supervisor about why he deleted it.

He said: “[This is what] my job has taught me: people are largely awful and I’m there behind my desk doing my best to save the world. But if no one tells the moderator what’s going on, they don’t know. Mark Zuckerberg, my boss [at the time], he has an opinion on what I think about this, but then he goes on to say that there’s a whole lot of us and we might have different experiences. Which is just a polite way of saying when you’ve got a lot of people, somebody’s going to be affected by this stuff.”

Gray was employed by CPL to carry out the work for Facebook. CPL has recently amended the job description for the role Gray filled to state that “Candidates in this position must be able to deal with extreme, graphic and sensitive content”. Another company that provides moderators to Facebook, Accenture, requires its contractors to sign a document acknowledging that PTSD may occur as a result of the material viewed.

Gray was provided with access to a counsellor by CPL, via an outsourced company, but said there was never enough time to avail of this because of the work targets and workload. He said the attitude was: “I haven’t got time for this. I’ve got targets to meet, I’ve got work to do. My boss is breathing down my neck. There’s 15,000 tickets in the high priority queue.”