Content Moderation: The harrowing, traumatizing job that left many African data workers with mental health issues and drug dependency

Exploited by subcontractors like Sama, content moderators for Meta in Nairobi face psychological trauma and silencing. This report highlights their mental health struggles and common coping mechanisms, including drug abuse and self-harm.

by Fasica Berhane Gebrekidan

Trigger Warning

This report discusses trauma, mental illness, addiction, and eating disorders. There are mentions of suicide and self-harm and descriptions of graphic content, violence, explicit sexual situations, child sexual abuse, ethnic cleansing, and animal cruelty. Reader discretion is advised.

Social media content moderation is perceived as an easy job when, in reality, it is very hazardous to one's mental health. In this research report, I aim to bring this issue to light. My name is Fasica Berhane Gebrekidan, and I worked for two years as a content moderator for Meta through the subcontractor Sama in Nairobi, Kenya. The report delves into what content moderators deal with in their everyday lives, the impact of the job on their mental health, and how they are neglected and exploited under poor working conditions. They are hired through subcontractor companies rather than the real clients: giant tech companies that directly profit from their work. I hope this report will help unveil the dangers of the job and its irreversible effects on the workers.

Content moderators risk their lives to train AI machines and algorithms, yet they receive only low pay, endure poor working conditions, and are left with mental health issues and addiction problems. These issues are critical, as such jobs destroy many young lives. A goal of this report is to influence new policy frameworks for data work and content moderation. The rights of these unprotected and vulnerable data workers should be acknowledged, and governments and lawmakers should stop their exploitation by big tech companies.

The issues faced by content moderators are rarely discussed, in part because tech companies hire people through intermediaries and require secrecy. Many data workers either work through freelance contractors (agents who assign small tasks to freelance workers for a small fee) or are hired by subcontractors. This is by design: it ensures that the real clients, the big tech companies, have no direct contact with the workers and are therefore not held responsible for any damages. Having worked as a content moderator for two years before being laid off by the subcontractor, I witnessed this firsthand. After my redundancy notice, no one was responsible for the psychological trauma I was going through. I wanted to address this issue.

I joined the Data Workers’ Inquiry project to have conversations with other data workers. From these conversations, I distilled a list of issues related to working conditions and the ways giant tech companies knowingly exploit workers for profit and the advancement of AI. In addition, I advocate for raising awareness about the job, increasing transparency for people who want to become content moderators, and working towards greater acknowledgment of the heroic act of risking one's mental health to keep the community safe from harmful and dangerous content online. Our role in keeping platforms free of harmful content should be, at the very least, acknowledged.


I interviewed more than 15 of my co-workers at Sama, who shared their insights and personal experiences: what it's like to be a content moderator, what the job takes from you, the challenges they faced, and the impact it has had on their lives. We all shared similar work experiences regarding the toxic content we consumed daily and the poor working conditions. Beyond that, we shared food, photos, and friendships, as well as good and bad times. Most importantly, we are all dealing with the consequences of this job. We are all fighting our own battles, be they anxiety, sleeping disorders, eating disorders, drug addiction, PTSD, or other mental health problems that result from being exposed to harmful content day in and day out.

My background in journalism and social studies was very helpful while conducting this investigation, as was my previous experience at the Ethiopian Herald, where I covered hundreds of news stories and articles over five years. The writing itself came easily to me, as I have experience conducting interviews and compiling stories. However, most of the stories were heartbreaking and unpleasant to hear, and reading and editing them meant reliving the traumatic experiences over and over again. Having to face my own emotional and psychological torment from PTSD was extremely challenging, but it also gave me hope for healing and returning to my normal state of mind, as writing out what I had been keeping in for so long was therapeutic. I tried to include everyone's story and convey a collective message that represents all of us ex-content moderators, with our different cultures, languages, and lifestyles, united in our diversity! Our experiences make us one.

The key demands we distill from this report are policy changes regarding content moderation and data work. No human should suffer from mental health issues in order to train AI tools. Tech giants like Meta should be held accountable for the mental and psychological harm inflicted on the workers who keep their platforms safe.

Recommended citation:

Gebrekidan, F. B. (2024). Content moderation: The harrowing, traumatizing job that left many African data workers with mental health issues and drug dependency. In M. Miceli, A. Dinika, K. Kauffman, C. Salim Wagner, and L. Sachenbacher (eds.), The Data Workers’ Inquiry. Creative Commons BY 4.0. https://data-workers.org/fasica

About the Author

Fasica Berhane Gebrekidan

For two years, Fasica worked as a content moderator for Meta/Facebook via Sama in Nairobi until she was unlawfully laid off for attempting to form a data workers’ union. She has firsthand experience moderating content in regions and communities impacted by conflict and war. Fasica has also worked as a co-researcher for the Distributed AI Research Institute (DAIR). She holds a BA in Journalism and Communications from Mekelle University in Ethiopia and has five years of experience as a Senior Reporter at The Ethiopian Herald, a daily English-language newspaper, covering gender equality, women’s rights, disability issues, youth, and social matters. Fasica loves to travel, read, and have deep conversations with the people around her. In her leisure time, she likes to paint and write poetry.
