A Facebook moderator says she took down beheadings, child pornography and animal abuse every day — but was 'treated like nothing'
- A former Facebook moderator revealed the disturbing imagery she had to remove from the site every day.
- In an anonymous interview with the BBC, she said she had to make quickfire takedown decisions about photos and videos containing beheadings, animal abuse, and child pornography.
- The content reviewer said the work gave her nightmares and criticised Facebook for the lack of support it provided staff.
- Facebook said graphic content is a "fraction" of what needs to be reviewed, and that it is committed to giving moderators the tools to do the job well.
A former Facebook moderator has revealed the horrors she was exposed to every day — and criticised the social network for not doing enough to support staff handling disturbing imagery.
"I had nightmares a couple of times. I remember one for example: People jumping from a building, I don't know why. And I remember people, instead of helping the people jumping, they were just taking photos and videos... I woke up crying."