FreeStuffsNG: 3:27pm On Nov 12, 2024 |
'I was moderating hundreds of horrific and traumatising videos'
Zoe Kleinman
Over the past few months the BBC has been exploring a dark, hidden world – a world where the very worst, most horrifying, distressing, and in many cases, illegal online content ends up.
Beheadings, mass killings, child abuse, hate speech – all of it ends up in the inboxes of a global army of content moderators.
You don’t often see or hear from them – but these are the people whose job it is to review and then, when necessary, delete content that either gets reported by other users, or is automatically flagged by tech tools.
The issue of online safety has become increasingly prominent, with tech firms under more pressure to swiftly remove harmful material.
And despite a lot of research and investment pouring into tech solutions to help, ultimately, for now, it’s still largely human moderators who have the final say.
Moderators are often employed by third-party companies, but they work on content posted directly on to the big social networks including Instagram, TikTok and Facebook.
They are based around the world. The people I spoke to while making our series The Moderators for Radio 4 and BBC Sounds were largely living in East Africa, and all had since left the industry.
Their stories were harrowing. Some of what we recorded was too brutal to broadcast. Sometimes my producer Tom Woolfenden and I would finish a recording and just sit in silence.
“If you take your phone and then go to TikTok, you will see a lot of activities, dancing, you know, happy things,” says Mojez, a former Nairobi-based moderator who worked on TikTok content. “But in the background, I personally was moderating, in the hundreds, horrific and traumatising videos.
“I took it upon myself. Let my mental health take the punch so that general users can continue going about their activities on the platform.”
There are currently multiple ongoing legal claims that the work has destroyed the mental health of such moderators. Some of the former workers in East Africa have come together to form a union.
“Really, the only thing that’s between me logging onto a social media platform and watching a beheading, is somebody sitting in an office somewhere, and watching that content for me, and reviewing it so I don’t have to,” says Martha Dark, who runs Foxglove, a campaign group supporting the legal action.
Mojez, who used to remove harmful content on TikTok, says his mental health was affected
In 2020, Meta, then known as Facebook, agreed to pay a settlement of $52m (£40m) to moderators who had developed mental health issues because of their jobs.
The legal action was initiated by a former moderator in the US called Selena Scola. She described moderators as the “keepers of souls”, because of the amount of footage they see containing the final moments of people’s lives.
The ex-moderators I spoke to all used the word “trauma” in describing the impact the work had on them. Some had difficulty sleeping and eating.
One described how hearing a baby cry had made a colleague panic. Another said he found it difficult to interact with his wife and children because of the child abuse he had witnessed.
Information and support can be found via BBC Action Line
I was expecting them to say that this work was so emotionally and mentally gruelling that no human should have to do it – I thought they would fully support the entire industry becoming automated, with AI tools evolving to scale up to the job.
But they didn’t.
What came across, very powerfully, was the immense pride the moderators had in the roles they had played in protecting the world from online harm.
They saw themselves as a vital emergency service. One says he wanted a uniform and a badge, comparing himself to a paramedic or firefighter.
“Not even one second was wasted,” says someone we called David. He asked to remain anonymous, but he had worked on material that was used to train the viral AI chatbot ChatGPT, so that it was programmed not to regurgitate horrific material.
“I am proud of the individuals who trained this model to be what it is today.”
Martha Dark campaigns in support of social media moderators
But the very tool David had helped to train might one day compete with him.
Dave Willner is the former head of trust and safety at OpenAI, the creator of ChatGPT. He says his team built a rudimentary moderation tool, based on the chatbot’s tech, which managed to identify harmful content with an accuracy rate of around 90%.
“When I sort of fully realised, ‘oh, this is gonna work’, I honestly choked up a little bit,” he says. “[AI tools] don’t get bored. And they don’t get tired and they don’t get shocked… they are indefatigable.”
Not everyone, however, is confident that AI is a silver bullet for the troubled moderation sector.
“I think it’s problematic,” says Dr Paul Reilly, senior lecturer in media and democracy at the University of Glasgow. “Clearly AI can be a quite blunt, binary way of moderating content.
“It can lead to over-blocking freedom of speech issues, and of course it may miss nuance human moderators would be able to identify. Human moderation is essential to platforms,” he adds.
“The problem is there’s not enough of them, and the job is incredibly harmful to those who do it.”
We also approached the tech companies mentioned in the series.
A TikTok spokesperson says the firm knows content moderation is not an easy task, and it strives to promote a caring working environment for employees. This includes offering clinical support, and creating programs that support moderators' wellbeing.
They add that videos are initially reviewed by automated tech, which they say removes a large volume of harmful content.
Meanwhile, OpenAI - the company behind ChatGPT - says it's grateful for the important and sometimes challenging work that human workers do to train the AI to spot such photos and videos. A spokesperson adds that, with its partners, OpenAI enforces policies to protect the wellbeing of these teams.
And Meta - which owns Instagram and Facebook - says it requires all companies it works with to provide 24-hour on-site support with trained professionals. It adds that moderators are able to customise their reviewing tools to blur graphic content.
The Moderators is on BBC Radio 4 at 13:45 GMT, Monday 11 November to Friday 15 November, and on BBC Sounds.
https://www.bbc.com/news/articles/crr9q2jz7y0o
5 Likes 2 Shares 
|
FreeStuffsNG: 3:28pm On Nov 12, 2024 |
You don’t often see or hear from them – but these are the people whose job it is to review and then, when necessary, delete content that either gets reported by other users, or is automatically flagged by tech tools.
“The problem is there’s not enough of them, and the job is incredibly harmful to those who do it.”
It is. Thumbs up to all our amazing moderators here on Nairaland ❤️
35 Likes 1 Share |
illicit(m): 3:31pm On Nov 12, 2024 |
Oh
1 Like |
Frigga13: 3:40pm On Nov 12, 2024 |
FreeStuffsNG: You don’t often see or hear from them – but these are the people whose job it is to review and then, when necessary, delete content that either gets reported by other users, or is automatically flagged by tech tools.
“The problem is there’s not enough of them, and the job is incredibly harmful to those who do it.”
It is. Thumbs up to all our amazing moderators here on Nairaland ❤️
Nairaland with its tribal and party-biased mods … in politics section.
For those doing great job in other section.. real kudos and much love .
Those politics Mods …. Can’t and won’t kill the spirit .
We still here .
19 Likes 2 Shares |
saddler: 3:45pm On Nov 12, 2024 |
|
Abbeyme: 3:45pm On Nov 12, 2024 |
Yet, you would turn the narrative another way to suit your paycheque
2 Likes |
Kimcutie: 3:46pm On Nov 12, 2024 |
Hmm
|
Tradepunter2: 3:47pm On Nov 12, 2024 |
Ok
|
EmperorCaesar(m): 3:47pm On Nov 12, 2024 |
Frigga13:
Nairaland with its tribal and party-biased mods … in politics section.
For those doing great job in other section.. real kudos and much love .
Those politics Mods …. Can’t and won’t kill the spirit .
We still here .

Empty comment
Go to that same BBC section and see how commenters keep bashing their mods too but they dont go about wailing like this
At 10, I was dropping better and more creative responses than this shii
Coming from an adult like you makes it
5 Likes 2 Shares 
|
CyracksMrBlogger(m): 3:48pm On Nov 12, 2024 |
Lies joor
1 Like |
MEGAWATCH: 3:48pm On Nov 12, 2024 |
This is why Tinubu said that if you open a Facebook for him, that he will die within few hours.
What a useless leader?
🤣🤣🤣🤣🤣🤣🤣
11 Likes 1 Share |
dumahi(m): 3:48pm On Nov 12, 2024 |
 A lot go on in the dark.
|
finallybusy: 3:48pm On Nov 12, 2024 |
There is nothing new here. I’ve known of this since 2005. BBC, once again, is mass beating a dead horse.
1 Like |
Mindlog: 3:49pm On Nov 12, 2024 |
Such jobs need regular therapy sessions and paid intermittent leaves as part of their welfare package.
2 Likes |
NothingDoMe: 3:50pm On Nov 12, 2024 |
I don't understand.
|
Ringstonermasks: 3:50pm On Nov 12, 2024 |
Are dey not well paid...?
Is the horror in Nigeria so called T-Pain Regime not worse than America horror movie ?
4 Likes 1 Share |
Frigga13: 3:50pm On Nov 12, 2024 |
EmperorCaesar:

Useless comment of the day
When I was 10, I would drop a better and more creative response than this shii
And this your comment is useful to you…
You get sense when you were 10 than your adult ..need to go back to your 10.
Maybe your brain has gone sour?
Yoruba being ..
8 Likes |
Christ4ever: 3:51pm On Nov 12, 2024 |
MEGAWATCH:
This is why Tinubu said that if you open a Facebook for him, that he will die within few hours.
What a useless leader?
🤣🤣🤣🤣🤣🤣🤣
His supporters are more useless
2 Likes |
TheChameleon: 3:51pm On Nov 12, 2024 |

I can assure you, I was traumatised with the different sizes I saw when I was a moderator of Xvide0s.
15 Likes 1 Share |
TheChameleon: 3:51pm On Nov 12, 2024 |
Okay ooo
|
TJOS(m): 3:52pm On Nov 12, 2024 |
G
|
Houseofglam7(f): 3:53pm On Nov 12, 2024 |
🫤
|
SaturnNick(m): 3:53pm On Nov 12, 2024 |
Wow
|
Flangelo12: 3:54pm On Nov 12, 2024 |
Nairaland used to display gory images willy nilly.
|
Nwaikpe: 3:54pm On Nov 12, 2024 |
IDF has not only proven to be stronger than allah, but have proven that allah is weak.
allahu abkar.
4 Likes 1 Share |
jmoore(m): 3:56pm On Nov 12, 2024 |
Some people don't know that you must not watch every video that comes your way.
One man wanted me to see a video of terrorist beheading someone that was sent to his whatsapp, I told him that I don't watch such videos. This man went ahead to narrate the content as a yeye storyteller.
You see adults typing 'send me the video'. And they will start regretting why they watched it when they knew the content is horrific.
11 Likes |
Angelfrost(m): 3:56pm On Nov 12, 2024 |
I have often wondered about the mental state of those who moderate and screen such submitted videos...!
Thank God for A.I. these days.
|
bigdammyj: 3:58pm On Nov 12, 2024 |
Noted.
|
Eriokanmi: 4:03pm On Nov 12, 2024 |
All we have are biased moderators on NL, with the exception of one or 2. They either bar you for days needlessly over comments which they have personal dislike for, or some of your posts don't get to the front page if they're written against their preferred personalities
4 Likes 1 Share |
blingxx(m): 4:03pm On Nov 12, 2024 |
Ever heard of the Darkweb ? Lol
|
trium: 4:04pm On Nov 12, 2024 |
It's a distressing job to do. Nairaland is a case study.
As a member, you can see how terribly the forum is run because mods allow people to get away with murder. But can I blame them? Not entirely. Seun thinks this thing he wrote is sufficient:
Disclaimer: Every Nairaland member is solely responsible for anything that he/she posts or uploads on Nairaland
My dear, not entirely! The day govt decides to be on your case, you go Mark your Zuckerberg


|
Gadafii: 4:05pm On Nov 12, 2024 |
Na the job of mynd be that, but the guy himself is a bigOt
|