From her village in Jharkhand, 26-year-old Monsumi Murmu watches at least 800 videos and images of violence, sexual abuse and other harm every day, getting paid roughly Rs 20,000 a month to do so. She is one of thousands of Indian women working as content moderators for global technology companies, reviewing explicit material flagged by algorithms to train artificial intelligence (AI) systems, according to a report in The Guardian.
Murmu does this work from her home’s veranda, one of the few places nearby with a mobile signal. Balancing her laptop on a mud slab built into the veranda wall, she logs in to watch hours of pornographic and explicit content that a computer program has flagged as possible violations.
The content moderation industry shows that even after recent breakthroughs in machine learning, AI still relies heavily on the data it is trained on, and on the humans who label it. In India, this labour falls mostly to women, who are also called “ghost workers,” according to The Guardian.
Murmu said the first months of being a “ghost worker” destroyed her sleep, the images following her into her dreams. “The first few months, I couldn’t sleep,” she told The Guardian. “I would close my eyes and still see the screen loading.”
Content moderators are often made to watch explicit material that goes beyond sexual abuse, including footage of people losing family members and of fatal accidents. These visuals are not easy to forget. On nights when her mind is plagued by them, her mother sits beside her, Murmu said.
Murmu was soon desensitised, however; the images no longer shocked her the way they once did. “In the end, you don’t feel disturbed – you feel blank.” There are still nights when the dreams return. “That’s when you know the job has done something to you.”
Sociologist Milagros Miceli said emotional numbing is a key characteristic of content moderation work. “There may be moderators who escape psychological harm, but I’ve yet to see evidence of that,” she told The Guardian.
She said Murmu’s work belongs in the category of dangerous work, “comparable to any lethal industry.” Several studies show that content moderation leads to behavioural changes and lasting emotional strain, with workers reporting heightened alertness, anxiety and disturbed sleep.
Raina Singh was 24 when she began working in the data annotation industry. Data annotation, closely related to content moderation, is the process of tagging content so that machines can interpret data correctly. After graduating, she had planned to teach, but a steady monthly income felt more necessary.
She returned home to Bareilly, Uttar Pradesh, and started working through a third-party organisation contracted to a global technology platform. Though the job description was vague, the work seemed manageable and paid around Rs 35,000 a month.
Her early assignments were mostly text-based tasks, such as reviewing short messages, flagging scams or detecting scam-like language. But after six months, the assignments changed dramatically. Without notice, she was transferred to a project linked to an adult entertainment site, where her job became flagging and removing child sexual abuse material.
“I had never imagined this would be part of the job,” Singh said, adding that her complaints about the content fell on deaf ears. Her manager once said in response, “This is God’s work – you’re keeping children safe.”
She and six other team members were later shifted to a different project, this time sorting pornographic content. “I can’t even count how much porn I was exposed to,” she told The Guardian. “It was constant, hour after hour.”
The work began seeping into her personal life. “The idea of sex started to disgust me,” she said. She felt increasingly removed from intimacy as a concept and began disconnecting from her partner.
When she raised concerns, the response was corporate: “Your contract says data annotation – this is data annotation.” Even a year after leaving the job, she said, the thought of sex still makes her nauseous, and at times she dissociates. “Sometimes, when I’m with my partner, I feel like a stranger in my own body. I want closeness, but my mind keeps pulling away.”
According to AI and data labour researcher Priyam Vadaliya, job descriptions rarely say what the work actually involves. “People are hired under ambiguous labels, but only after contracts are signed and training begins do they realise what the actual work is,” she told The Guardian.
The remote or part-time work is widely promoted as an “easy money” opportunity, circulated through YouTube videos, Telegram channels, LinkedIn posts and influencer tutorials that recast it as safe, flexible and requiring little skill.
Of the eight Indian data-annotation and content-moderation companies The Guardian spoke to, only two said they provide psychological help. The others said the work was not challenging enough to require mental healthcare.
Even where support exists, workers must seek it out themselves, an approach that “ignores the reality that many data workers, especially those coming from remote or marginalised backgrounds, may not even have the language to articulate what they are experiencing,” Vadaliya said.
Because India’s labour laws do not formally recognise psychological harm, workers are left without proper guardrails.
Isolation adds to the mental toll on content moderators and data workers. They are often made to sign non-disclosure agreements (NDAs), which effectively bar them from speaking about their work with family or friends.
Murmu said she feared explaining her work: if her family understood what she does, she could be forced to stop earning and be married off, like other girls in her village.
More than her mental health, she worries about finding another job. Four months remain on her contract with the tech company that pays her approximately Rs 20,000 a month. “Finding another job worries me more than the work itself,” she said.
She copes with the toll in other ways, sitting with nature for hours to calm her mind. “I go for long walks into the forest. I sit under the open sky and try to notice the quiet around me,” she said. “I don’t know if it really fixes anything. But I feel a little better.”