Towards the end of 2019, professional services vendor Cognizant announced that it would discontinue its role as a content moderation provider.
The announcement came after online media outlet The Verge published two investigations into working conditions at sites dedicated to moderating Facebook content. At the time of the announcement, Cognizant employed thousands of moderators around the world to remove hate speech, terrorist propaganda, and other inappropriate material from platforms including Facebook, Google, and Twitter.
Cognizant will close its Phoenix and Tampa sites on March 1, 2020. The company confirmed that individuals who lose their jobs would be given “retention bonuses, severance packages, and the opportunity to participate in various reskilling programs.” There has been no confirmation of whether affected employees will be offered other roles within Cognizant.
In response to the decision, Facebook announced that it will increase the number of moderators working at a site in Texas, operated by Genpact, to address the shortfall created by Cognizant’s exit from the market.
A Facebook spokesperson said: “One of the reasons we work with partners is to be able to make adjustments quickly to ensure Facebook remains safe for people. Cognizant’s content reviewers have played a valuable role in keeping our platforms safe for people all over the world and we thank them for the work they’ve done and continue to do.”
Cognizant released a statement which said: “We have determined that certain content work in our Digital Operations practice is not in line with our strategic vision for the company and we intend to exit this work over time. This work is largely focused on determining whether certain content violates client standards — and can involve objectionable materials.”
This comes after a February 2019 exposé by The Verge which revealed that content moderators at the Phoenix site were being diagnosed with post-traumatic stress disorder after viewing graphic and disturbing images as part of their daily duties. Other employees claimed that they feared for their safety after being threatened by coworkers whose mental health was suffering because of the work.
A second report, released in June 2019, revealed that staff at the Tampa site were mistreated by managers after breaking their non-disclosure agreements to highlight the poor working conditions.
Facebook, along with several content moderation service providers, is now facing legal actions in Ireland, the US, and elsewhere over the psychological trauma suffered by content moderators. The number of moderators included in these compensation actions continues to grow on a weekly basis.