In the latest development, a report says that Facebook contractors stare at hours of child pornography, bestiality and racism in an effort to keep depravity off the site.

“I am dead serious about this,” Chief Executive Mark Zuckerberg said last month. The social-media giant is under pressure to improve its defenses after failing during last year’s presidential campaign to detect that Russian operatives tried to use its platform to influence the outcome.
By her second day on the job, Sarah Katz knew how jarring it can be to work as a content moderator for Facebook Inc.
She says she saw anti-Semitic speech, bestiality photos and video of what seemed to be a girl and boy told by an adult off-screen to have sexual contact with each other.
Katz, 27 years old, says she reviewed as many as 8,000 posts a day, with little training on how to handle the distress, though she had to sign a waiver warning her about what she would encounter.
Coping mechanisms among content moderators included a dark sense of humor and swiveling around in their chairs to commiserate after a particularly disturbing post.
She worked at Facebook’s headquarters campus in Menlo Park, Calif., and ate for free in company cafeterias. But she wasn’t a Facebook employee.
Katz was hired by a staffing company that works for another company that in turn provides thousands of outside workers to the social network. Facebook employees managed the contractors, held meetings and set policies.
The outsiders did the “dirty, busy work,” says Katz, who earned $24 an hour. She left in October 2016 and is now employed as an information-security analyst at business-software firm ServiceNow Inc.
“Humans, still, are the first line of defense. Facebook, YouTube and other companies are racing to develop algorithms and artificial-intelligence tools, but much of that technology is years away from replacing people,” says Eric Gilbert, a computer scientist at the University of Michigan.
Deciding what does and doesn’t belong online is one of the fastest-growing jobs in the technology world—and perhaps the most grueling. The equivalent of 65 years of video is uploaded to YouTube each day.
Facebook receives more than a million user reports of potentially objectionable content a day. Facebook will have 7,500 content reviewers by the end of December, up from 4,500, and it plans to double the number of employees and contractors who handle safety and security issues to 20,000 by the end of 2018.