Online Content Moderation

We have all heard about the sophisticated algorithms developed by big tech companies to police content uploaded to their sites by users, and how they take humans out of the equation as much as possible. But what happens when an algorithm encounters a potential exception condition?

When that happens, a pair of human eyes is needed to determine whether the content is permitted under the site’s Acceptable Use Policy.

I came across two articles posted by Wired about these little-known behind-the-scenes tech workers, labouring around the clock and across different time zones to filter out material flagged by their automated counterparts. Due to the sheer volume of flagged content coming through, along with tight labour and possibly OHS laws in developed countries, content moderation is usually offshored to companies in India and the Philippines, where there is plenty of cheap labour ready to take on whatever jobs come their way.

This is a thankless job, taxing not just physically but also mentally. The sheer amount of mind-scarring content that these workers have to sift through in the course of their work, just to keep the rest of us safe, will no doubt stay with them for the rest of their lives.

To find out about the sort of training these workers go through, click on the links below to read their stories.

Watch Workers Learn How to Filter Obscene and Violent Photos From Dating Sites

The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed