Psychological Effects of High-Stress Digital Labour: Content Moderation
Rapid technological advances and widespread internet access have created a global market for digital labour. This market encompasses a vast range of online jobs, from creative tasks like software development to routine work like data entry. While this model offers flexibility for both companies and workers, it often blurs traditional employment boundaries, raising critical questions about worker rights and protections.
Microwork and Digital Sweatshops
A key component of this market is microwork, a practice where large projects are broken down into small, repetitive tasks that require few specialized skills. These tasks are typically outsourced to a large, often international, workforce through online platforms.
Although this model provides employment opportunities, critics argue it creates exploitative conditions similar to traditional sweatshops, leading to the term digital sweatshops. Workers in these environments often face low pay, job insecurity, and a lack of benefits. For example, platforms like Amazon Mechanical Turk and Clickworker assign tasks such as data labelling and survey participation, paying workers per task. This structure frequently results in inconsistent and very low earnings.
Content Moderation as High-Stress Microwork
Content moderation is an essential service that protects the safety and integrity of online platforms. Moderators review user-generated content to ensure it complies with community guidelines, removing harmful or inappropriate material. This vital function is often structured as microwork, with conditions that resemble those of digital sweatshops.
Major technology companies (e.g., Meta, Google, TikTok) typically outsource their content moderation to third-party firms. These contractors often operate in countries in the Global South, such as Kenya and the Philippines, where labour costs are lower. Workers in these locations review enormous volumes of disturbing and explicit content, shielding users from harmful material. This outsourcing model allows major tech companies to distance themselves from the challenging labour conditions their moderators face.
The Psychological Toll of Content Moderation
The nature of content moderation—requiring constant exposure to distressing material—can have severe psychological consequences. Many moderators report symptoms consistent with post-traumatic stress disorder (PTSD), including anxiety, depression, and emotional numbness. Continuous exposure to graphic content also creates a risk of desensitization, which can negatively impact personal relationships and overall well-being.
These psychological risks are compounded by a high-pressure work environment defined by strict performance quotas and inadequate support, leading to burnout and high turnover rates. Furthermore, strict non-disclosure agreements (NDAs) often forbid moderators from discussing their work, which isolates them and prevents them from seeking help. Testimonies from moderators, such as those from a former contractor for Meta in Kenya, highlight the urgent need for better mental health support, improved working conditions, and greater accountability from the tech giants that rely on this hidden labour.
Content Moderator for Facebook Shares Her Experience
Sonia Kgomo, an organiser with African Tech Workers Rising and a former content moderator for Facebook, recently shared her experience in the British daily The Guardian:
"A mum of two young children, I was recruited from my native South Africa with the promise to join the growing tech sector in Kenya for a Facebook subcontractor, Sama, as a content moderator. For two years, I spent up to 10 hours a day staring at child abuse, human mutilation, racist attacks and the darkest parts of the internet so you did not have to.
It was not just the type of content I had to watch that gave me insomnia, anxiety and migraines, it was the quantity too. In Sama we had something called AHT, or action handling time. This was the amount of time we were given to analyse and rate a piece of content. We were being timed, and the company measured our success in seconds. We were constantly under pressure to get it right.
You could not stop if you saw something traumatic. You could not stop for your mental health. You could not stop to go to the bathroom. You just could not stop. We were told the client, in our case Facebook, required us to keep going.
This was not the life I imagined when I moved to Nairobi. Isolated from my family, my only real community was my colleagues at Sama and other outsourcing companies. When we gathered, our conversations always circled back to the same thing: our work, and the way it was breaking us.
The more we talked, the more we realised something was happening that was bigger than our personal stories. Every content moderator, data annotator and AI worker we met had the same stories: impossible quotas, profound trauma and a disregard for our wellbeing.
It was not just a Sama problem. It was not just a Facebook problem. It was the way the entire tech industry operated – outsourcing the most brutal digital labour and profiting from our pain." (Kgomo, Sonia. 2025. I was a content moderator for Facebook. I saw the real cost of outsourcing digital labour, The Guardian, February 12. Emphases added.)
Wage Levels
Content moderators in the Global South, such as Kenya, have reported wages as low as $1.50 per hour, whereas their counterparts in the United States typically earn between $15 and $20 per hour, often with additional benefits. Although direct wage comparisons should consider purchasing power parity and local living costs, the gap is significant nonetheless. Recent legal cases highlight concerns around these wage disparities and working conditions, according to Time magazine:
"The two-year legal battle stems from allegations of human rights violations at an outsourced Meta content moderation facility in Nairobi, where employees hired by a contractor were paid as little as $1.50 per hour to view traumatic content, such as videos of rapes, murders, and war crimes. The suits claim that despite the workers being contracted by an outsourcing company, called Sama, Meta essentially supervised and set the terms for the work, and designed and managed the software required for the task. Both companies deny wrongdoing and Meta has challenged the Kenyan courts' jurisdiction to hear the cases. But a court ruled in September that the cases could each proceed. Both appear likely to go to trial next year, unless the Kenyan Supreme Court intervenes. ... If successful, the lawsuits could enshrine a new precedent into Kenyan law that Big Tech companies – not just their outsourcing partners – are legally liable for any wrongdoing that happens inside subcontracted facilities." (Perrigo, Billy. 2024. Kenya’s President Wades Into Meta Lawsuits, Time, December 11)
A Modern Parallel: Digital Labour and Textile Sweatshops
The exploitative conditions in digital sweatshops across the Global South mirror the historical dynamics of textile and garment factories, particularly within the context of North-South relations. While the work has shifted from physical production to digital tasks, the underlying business model remains largely the same. Companies, often based in the Global North, capitalize on lower labour costs, weaker regulatory environments, and minimal worker protections in the Global South to maximize profits.
Just as in the traditional textile industry, this model thrives on limited transparency and oversight, which allows substandard working conditions to persist. The combination of an abundant labour supply and the ability to sidestep stronger labour laws in their home countries creates a system with few restrictions. This dynamic highlights a persistent challenge in the global economy: how to uphold fair and humane labour practices when significant economic and power imbalances continue to shape the market.
Conclusion
The digital economy's reliance on exploitative labour practices reveals a troubling continuity with historical patterns of global inequality, now manifested through screens rather than factory floors. As technology companies continue to expand their reach and influence, the human cost of maintaining safe online spaces remains largely invisible to end users who benefit from these services. The testimonies of workers and the ongoing legal challenges in Kenya signal a growing resistance to these practices and demand for corporate accountability.
The structural nature of this exploitation presents formidable barriers to meaningful change. Tech companies' ability to shift operations between jurisdictions, combined with the asymmetrical power dynamics between Global North corporations and Global South workers, creates a system that appears largely impervious to traditional regulatory approaches. The very architecture of digital outsourcing, designed to maximize profit margins through labour arbitrage, suggests that the psychological toll on content moderators may be an inherent feature rather than an unintended consequence of this economic model.
This reality raises fundamental questions about whether the current trajectory of digital capitalism can ever adequately protect worker well-being, or whether the human costs documented in places like Kenya and the Philippines represent the inevitable price of maintaining the global digital infrastructure upon which billions now depend.
Video
An exposé on the traumatic conditions faced by Facebook's content moderators. [13m 32s]
Former moderators describe their experience reviewing graphic and disturbing material for little pay and with inadequate mental health support. The work involved constant exposure to violent and exploitative content in a high-pressure environment with unrealistic performance expectations. They reveal that the stressful job was compounded by an unsanitary and toxic workplace, leaving them with lasting psychological trauma.
Discussion
1. The text draws a direct parallel between modern digital sweatshops and historical textile sweatshops. How effective is this analogy? In what ways does the comparison hold up, and in what crucial ways do the two forms of labour differ?
2. As end-users of social media platforms, we benefit from a sanitized online environment largely because of content moderators. What responsibility, if any, do consumers have in this system? Can consumer pressure or awareness campaigns realistically lead to better conditions for these hidden workers?
Critical Thinking
1. The conclusion suggests that the psychological harm experienced by moderators is an inherent feature of the business model, not an unintended consequence. Challenge or defend this assertion. What specific structural changes would a company have to make to prove this conclusion false?
2. The text suggests that the abundant labour supply in the Global South enables these exploitative practices. Research and analyze: How do global economic inequalities create conditions where workers accept harmful employment? What would need to change systemically to alter these dynamics?
Further Investigation
1. Technological Solutions: Research the role of Artificial Intelligence (AI) in content moderation. Is AI a potential solution that could reduce the psychological burden on humans, or does it primarily function as a tool that creates new forms of high-stress digital work (e.g., training the AI by labelling graphic content)? Present findings on the current capabilities and limitations of AI in this field.
2. The Future of Digital Work: Given the trends described in this text, write a speculative piece imagining digital labour conditions in 2040. Will the problems described be solved, worsened, or transformed into new challenges? Ground your speculation in current technological developments, economic trends, and social movements. Consider multiple scenarios: What would happen if current legal challenges succeed? If they fail? If new technologies like advanced AI change the nature of content moderation entirely?
Notes: Country data were sourced from the International Monetary Fund (IMF) and the CIA World Factbook; maps are from Wikimedia, licensed under Creative Commons Attribution-ShareAlike (BY-SA). Rights for embedded media belong to their respective owners. The text was adapted from lecture notes and reviewed for clarity using Claude.
Last updated: Fall 2025