Perhaps Facebook does not really want content on its platform to be moderated

In an interview with RT News, internet law and social media lawyer Yair Cohen, alongside privacy campaigner Bill Merr, discusses the recent revelations about the poor working conditions of Facebook's contract moderators and what Facebook could do to improve them.

If all of these allegations are true, they tell us something about the way Facebook treats the whole subject of content moderation. Maybe it doesn't really want to do it.

A moderator working at the contractor company Cognizant has described having to go through as many as 200 graphic videos per day, watching at least 15 to 30 seconds of each, including footage of animal cruelty and extreme violence. They also claim to have been forced to work in unsanitary conditions under tight control, fearing the loss of their job if they took sick leave (which is unpaid).

I have heard those allegations before, aired by moderators located outside the United States, in India in particular. When you look at the numbers, Facebook has approximately 30,000 moderators, and when you consider the huge number of posts these people have to moderate, with Facebook having millions and millions of users, it is a difficult job. These 30,000 people wake up every day to screen a sewer of the most horrific images and content, and without proper training, proper supervision and care for their wellbeing and health, this is a call to action. These people deserve, and are owed, much more by Facebook.

It is indeed a very, very sad reflection on our society today. First, that people are posting sickening, offensive content like this. Second, that Facebook has chosen to outsource this work largely as a cost-saving measure. Third, that these contractors are financially squeezed by Facebook. And fourth, that Cognizant are treating their staff so horrifically. It is a terrible indictment right across the spectrum.

These individuals potentially have legal cases against both Cognizant and Facebook, as both companies really need to take more responsibility.

Moderators need to act very fast and with a great degree of accuracy, but the system is failing and there are cracks in it: seriously inappropriate videos have gone viral on Facebook. So even if the moderators were given better working conditions, the problem would remain until Facebook invests money in technology to moderate video content. This is possible; look at Google and the artificial intelligence it now has in place for YouTube, which enables the removal of millions of offensive videos. Even then, human moderators will still be needed, and they need to be cared for.

In terms of resources, it is a matter for the internet company, in this case Facebook, to ensure that there are sufficient resources for moderation. Two other things are important. First, we need to create cooperation between social media companies. For example, Google may have enough artificial intelligence to handle content because of its experience with YouTube, and perhaps it is time to share that technology with Facebook. Second, we need greater transparency, perhaps allowing an external body to go into Facebook and examine all of its moderation processes. That is the only way we will be able to tell whether enough resources are being pulled in, whether there are any gaps that need to be filled and whether there is any scope for greater cooperation. But it does not look as if Facebook is willing to allow this degree of transparency or any external audit to take place, which is incredibly unfortunate.
