The film The Cleaners is about the Filipino content moderators who sift through thousands of graphic images every day so that you and I never have to see them on social media platforms. It will be screened in Hong Kong next month.

The 8th Human Rights Documentary Film Festival, organized by Amnesty International, is showing The Cleaners as its closing film. The festival runs from September 26 to October 4 and will screen seven films in all.

German directors Hans Block and Moritz Riesewieck made the film about the people who look at as many as 25,000 violent or explicit images and videos every day in order to keep such disturbing material off Facebook and other social media platforms.

They work as much as 10 hours a day at computer screens, viewing content such as the sexual abuse of children, people committing self-harm, or terrorist attacks, and pressing either an “ignore” or a “delete” button. Hired by outsourcing companies, these content moderators are exposed, day after day, to disturbing material such as a child being molested by an older man or a terrorist beheading a prisoner.


The movie highlights the work of these moderators, as well as the price they pay mentally and emotionally for screening social media for the rest of the world.

Moritz Riesewieck said, “For most of them, it’s a job they are proud of. You have to consider many people have jobs that are much less prestigious than working in such nice-looking buildings in one of the best parts of Manila.”

Their pay, while decidedly low by Western standards, is still better than the salaries other jobs offer and enables them to support their families.

Messrs. Block and Riesewieck’s journey in making The Cleaners began in 2013, when they saw that a video of a young girl being molested by an elderly man had been shared more than 16,000 times on Facebook before it was taken down. This led them to wonder how Facebook removes such content: whether it is done by people or by an algorithm.

To their surprise, when they examined job postings in the Philippines for these positions, they found that the listings did not really describe what the work entailed. According to Mr. Riesewieck, the jobs were for “community operations analysts” or “data analysts with international clients” but made no mention of moderating content.


The people who signed on found out the true nature of their work only after they had signed a contract, when images of pornography or abuse appeared before them on the first day of training. By then, it was too late to opt out.

However, Messrs. Block and Riesewieck say that many of the content moderators they met in Manila are proud of the work they do, since it means keeping social media free from disturbing content.

The 15 to 20 content moderators the directors interviewed for The Cleaners had signed non-disclosure agreements, and did not give their real names. Nor did they identify which social media sites hired them.

One content moderator says he sees 25,000 images a day, a feat he said should put him in the Guinness Book of World Records.

Another talked about being proud of his work. “I am passionate about my job. I like what I am doing. As a moderator, you’re like security. You protect the user.”


However, the dark side of this kind of work is the psychological toll the images and videos take on the moderators who view them day after day. Studies have shown that such work can affect the brain negatively, normalizing violence for those exposed to it.

One moderator, who had focused on removing videos of self-harm, took his own life, hanging himself at home.

Facebook said in July that it plans to double its safety and security team to 20,000 this year. The social media giant employs four clinical psychologists across three regions to run resiliency programs for the content moderators who work with disturbing images daily.

According to Ellen Silver, Facebook’s vice president of operations, “This job is not for everyone – so to set people up for success, it’s important that we hire people who will be able to handle inevitable challenges that the role represents. Just as we look for language proficiency and cultural competency, we also screen for resiliency.”