News

Judgement call: the unseen pressures on the people who police the internet

8 August 2025
(Photo: A line of people sitting at computer screens in an office. Credit: GCShutter/Adobe.)

Key points

  • Online content moderators in India have high productivity targets, assessing around 1,500 items per 8-hour shift.
  • They often prioritise items that take the least time to assess, which can lead to problematic content staying online, or non-offensive content being removed.
  • Despite dealing with often disturbing images and footage, there is minimal mental health support.

The harsh working conditions of human online content moderators adversely affect internet content, research has shown.

Tania Chatterjee, a joint PhD student from The University of Queensland’s School of Communication and Arts and the Indian Institute of Technology Delhi, led a study of the decision-making processes of social media moderators in India.

“Human moderators filter content flagged by an algorithm or users as potentially problematic,” Ms Chatterjee said. 

“This important work is often outsourced by online platforms, and the moderators I interviewed in India assessed content from the United States, South America, Europe and the Middle East.

“What quickly became apparent was the immense pressure and workloads on these people who are employed to keep the internet safe.”

Ms Chatterjee said the moderators in the study were routinely given around 1,500 items to review in an 8-hour shift.

“This means they have just 15-20 seconds to make a contextual judgement with high accuracy,” she said.

“They’re also expected to follow extensive guidelines provided by the platform – guidelines that are often internal and considered trade secrets.

“Meeting targets with accuracy under their employment conditions just isn’t reasonable.”

Ms Chatterjee said moderators commonly made content decisions based simply on what would take the least time.

“Through screensharing I saw how moderators prioritised content they could assess quickly, which left the more complex cases outstanding,” she said.

“On some platforms, problematic content was more likely to be removed altogether because removal was a 2-step process, whereas de-ranking it to be less visible online took 4 steps.

“The outcome is two-fold – problematic content that should be removed stays online, and content that’s innocent when appropriate context is applied gets removed.

“Moderators are also more likely to rely on automation tools to remove flagged words or phrases, because it’s quicker than making an independent assessment.”

Ms Chatterjee said the moderators in the study were low-paid, given limited job training and rarely accessed mental health support – if it was on offer at all.

“With an employment crisis in India, workers are unlikely to complain about their labour conditions, and many moderators are new university graduates getting what they think is a foot in the door in the digital space,” she said.

“Online platforms really need to re-think moderator targets and implement some simple design changes in their portals to streamline processes.

“They should also be more transparent about how much they spend on human moderators, both in-house and outsourced, to ensure it’s proportionate to the amount of content they host. 

“Human content moderators have a crucial role in policing the internet, but our research shows how harsh employment conditions shape the outcomes of their work.”

The research was published in New Media & Society.
