High-Stress Digital Labour


The rapid advancement of technology and the expansion of internet connectivity have paved the way for a global digital labour market. This market includes a wide range of online tasks—from software development and graphic design to routine activities like data entry and transcription. While digital labour provides flexibility and enables businesses to tap into a global workforce, it often blurs traditional employment boundaries, raising complex questions about labour rights and protections.

Within the digital labour sphere, the terms “digital sweatshop” and “microwork” have come to characterize certain labour practices. Microwork refers to breaking down large projects into numerous small, repetitive tasks that require minimal specialized skills. These tasks are typically outsourced to a vast, often international, workforce through online platforms.

Although this model can offer employment opportunities, critics argue that it fosters exploitative conditions akin to traditional sweatshops—hence the term “digital sweatshops.” Workers in these settings frequently contend with low pay, job insecurity, and a lack of benefits. For example, platforms such as Amazon Mechanical Turk and Clickworker assign tasks like data labeling, content tagging, and survey participation, paying workers per completed task. This pay structure can lead to inconsistent and often meager earnings.
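To see how per-task pay translates into hourly earnings, consider a minimal back-of-the-envelope sketch in Python. Every number below (the task rate, the completion time, the share of unpaid time spent searching for work) is a hypothetical illustration, not platform data:

    # Rough piece-rate arithmetic: per-task pay -> effective hourly wage.
    # All numbers are hypothetical illustrations, not actual platform data.

    PAY_PER_TASK_USD = 0.05   # assumed payment per completed microtask
    SECONDS_PER_TASK = 45     # assumed time to complete one task
    UNPAID_SHARE = 0.25       # assumed fraction of time spent finding tasks

    paid_tasks_per_hour = 3600 * (1 - UNPAID_SHARE) / SECONDS_PER_TASK
    hourly_wage = paid_tasks_per_hour * PAY_PER_TASK_USD

    print(f"Tasks completed per hour: {paid_tasks_per_hour:.0f}")  # -> 60
    print(f"Effective hourly wage:    ${hourly_wage:.2f}")         # -> $3.00

Because earnings depend on how many tasks a worker can find and finish, and on whether requesters accept the completed work, piece-rate income of this kind is volatile as well as low.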



Content Moderation as a Type of Microwork

Content moderation—reviewing user-generated content for compliance with community guidelines and removing harmful or inappropriate material—plays a vital role in ensuring the safety and integrity of online platforms. Increasingly, this function is carried out as a form of microwork under conditions reminiscent of digital sweatshops. Major technology companies—including Meta (formerly Facebook), Google, and TikTok—outsource their content moderation to third-party firms. These firms frequently operate in countries in the Global South, such as Kenya and the Philippines, where labour costs are far lower.

Workers there review vast amounts of often disturbing and explicit content to protect users from harmful exposure. Despite the importance of their role, these moderators commonly endure low wages, minimal job security, and inadequate mental health support. By outsourcing these tasks, tech giants maintain plausible deniability regarding labour practices, distancing themselves from the day-to-day realities of moderation work.



Problems Associated with Content Moderation

The nature of content moderation requires continuous exposure to distressing material, which can lead to serious psychological effects. Moderators may experience symptoms similar to post-traumatic stress disorder (PTSD), such as anxiety, depression, and emotional numbness. Constant exposure to graphic content also risks desensitization, potentially impacting personal relationships and overall well-being.

Furthermore, a high-pressure work environment—characterized by strict performance metrics and limited support—can contribute to burnout and high turnover. Non-disclosure agreements often prevent moderators from discussing their work, deepening their isolation and hindering them from seeking help. A former Facebook content moderator in Kenya, whose account appears below, underscores these challenges and the urgent need for stronger mental health protections and professional standards.



Sonia Kgomo Shares Her Experience

Sonia Kgomo, an organiser with African Tech Workers Rising and a former content moderator for Facebook, recently shared her experience in the British daily The Guardian:

"A mum of two young children, I was recruited from my native South Africa with the promise to join the growing tech sector in Kenya for a Facebook subcontractor, Sama, as a content moderator. For two years, I spent up to 10 hours a day staring at child abuse, human mutilation, racist attacks and the darkest parts of the internet so you did not have to.

It was not just the type of content I had to watch that gave me insomnia, anxiety and migraines, it was the quantity too. In Sama we had something called AHT, or action handling time. This was the amount of time we were given to analyse and rate a piece of content. We were being timed, and the company measured our success in seconds. We were constantly under pressure to get it right.

You could not stop if you saw something traumatic. You could not stop for your mental health. You could not stop to go to the bathroom. You just could not stop. We were told the client, in our case Facebook, required us to keep going.

This was not the life I imagined when I moved to Nairobi. Isolated from my family, my only real community was my colleagues at Sama and other outsourcing companies. When we gathered, our conversations always circled back to the same thing: our work, and the way it was breaking us.

The more we talked, the more we realised something was happening that was bigger than our personal stories. Every content moderator, data annotator and AI worker we met had the same stories: impossible quotas, profound trauma and a disregard for our wellbeing.

It was not just a Sama problem. It was not just a Facebook problem. It was the way the entire tech industry operated – outsourcing the most brutal digital labour and profiting from our pain." (Kgomo, Sonia. 2025. "I was a content moderator for Facebook. I saw the real cost of outsourcing digital labour." The Guardian, February 12.)
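Kgomo's description of AHT amounts to a throughput quota, and its arithmetic can be sketched in a few lines of Python. Only the 10-hour shift length comes from her account; the per-item time budget and the break allowance below are hypothetical values, since she does not report Sama's actual figures:

    # Back-of-the-envelope arithmetic for an AHT ("action handling time") regime.
    # Only the 10-hour shift comes from Kgomo's account; the AHT value and the
    # break allowance are hypothetical illustrations, not reported figures.

    SHIFT_HOURS = 10      # "up to 10 hours a day" (Kgomo)
    AHT_SECONDS = 50      # assumed time budget per piece of content
    BREAK_MINUTES = 60    # assumed total break time per shift

    working_seconds = SHIFT_HOURS * 3600 - BREAK_MINUTES * 60
    items_per_shift = working_seconds // AHT_SECONDS

    print(f"Items reviewed per shift: {items_per_shift}")    # -> 648
    print(f"Decisions per minute: {60 / AHT_SECONDS:.2f}")   # -> 1.20

Even under these assumptions, a moderator makes a consequential judgement more than once a minute, hundreds of times a day, which helps explain why the quota itself, and not only the content, is experienced as harmful.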


Wage Levels in the Global South

Content moderators in Global South countries such as Kenya have reported wages as low as $1.50 per hour, whereas their counterparts in the United States typically earn between $15 and $20 per hour, often with additional benefits. Direct wage comparisons should account for purchasing power parity and local living costs, but even so the gap is striking. Recent legal cases highlight concerns about these wage disparities and the associated working conditions, as Time magazine reports:

"The two-year legal battle stems from allegations of human rights violations at an outsourced Meta content moderation facility in Nairobi, where employees hired by a contractor were paid as little as $1.50 per hour to view traumatic content, such as videos of rapes, murders, and war crimes. The suits claim that despite the workers being contracted by an outsourcing company, called Sama, Meta essentially supervised and set the terms for the work, and designed and managed the software required for the task. Both companies deny wrongdoing and Meta has challenged the Kenyan courts' jurisdiction to hear the cases. But a court ruled in September that the cases could each proceed. Both appear likely to go to trial next year, unless the Kenyan Supreme Court intervenes. ... If successful, the lawsuits could enshrine a new precedent into Kenyan law that Big Tech companies – not just their outsourcing partners – are legally liable for any wrongdoing that happens inside subcontracted facilities." (Perrigo, Billy. 2024. "Kenya's President Wades Into Meta Lawsuits." Time, December 11.)

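As noted above, nominal wage figures should be adjusted for purchasing power before being compared. A minimal sketch of that adjustment follows, using the $1.50 and $15 to $20 figures reported above; the price-level factor is a hypothetical placeholder, not an actual World Bank statistic:

    # Compare the nominal wage gap with a purchasing-power-adjusted one.
    # Wage figures are those reported in the text; the price-level factor
    # is a hypothetical placeholder, not an actual World Bank statistic.

    KENYA_WAGE_USD = 1.50     # reported hourly wage at the Nairobi facility
    US_WAGE_USD = 17.50       # midpoint of the reported $15 to $20 range
    PRICE_LEVEL_KENYA = 0.40  # assumed: $1 in Kenya buys what $2.50 buys in the US

    nominal_gap = US_WAGE_USD / KENYA_WAGE_USD
    kenya_wage_adjusted = KENYA_WAGE_USD / PRICE_LEVEL_KENYA  # US-equivalent dollars
    adjusted_gap = US_WAGE_USD / kenya_wage_adjusted

    print(f"Nominal gap: {nominal_gap:.1f}x")          # -> 11.7x
    print(f"PPP-adjusted gap: {adjusted_gap:.1f}x")    # -> 4.7x

Even under a generous adjustment, the wage gap remains several-fold, which supports the point above: purchasing power narrows the disparity but does not erase it.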

North-South Relations: Parallels with Textile Sweatshops

In the context of North-South relations, the exploitative conditions found in digital sweatshops across the Global South bear striking similarities to those historically seen in textile and garment factories. Although the nature of the work differs—textile production vs. digital tasks—the underlying business model is comparable: multinational companies capitalize on low-cost labour, weaker regulatory frameworks, and minimal labour protections to maximize profit margins.

As with textile sweatshops, digital enterprises operating in these regions often provide limited transparency and oversight, perpetuating substandard working conditions. The combination of abundant labour supply, lax regulations, and the ability to sidestep stricter labour laws in home countries highlights a broader challenge in global labour markets—where economic inequalities and power imbalances continue to fuel debates on how to uphold fair and humane working practices.



Conclusion

Recent lawsuits and regulatory initiatives indicate growing scrutiny of outsourced digital labour. Stakeholders in government, industry, and civil society increasingly recognize that ethical standards and accountability measures are essential for sustainable digital work environments in both the Global North and the Global South.



Some Questions to Think About

What are the economic and social trade-offs of outsourcing digital labour to the Global South? For example, how do employment opportunities and economic growth in these regions weigh against issues like low pay, job insecurity, and mental health challenges for workers?

What strategies can be implemented to protect the mental health of content moderators without compromising the safety and integrity of online platforms? How can companies ensure effective content moderation while reducing the psychological toll on workers?

How do the challenges in digital labour reflect broader issues in global labour markets? What insights or strategies from historical efforts to improve conditions in traditional sweatshops (e.g., textiles) could apply to the digital labour context?



Last updated: Spring 2025