Data Workers' Inquiry

The Emotional Labor Behind AI Intimacy

My name is Michael Geoffrey Asia, and I wrote this testimony to tell the story of workers like me who found ourselves trapped in the hidden corners of the AI industry, where human emotion becomes data.

by Michael Geoffrey Asia

Cite this work as:

Asia, M. G. (2025). The Quiet Cost of Emotional Labor. In: M. Miceli, A. Dinika, K. Kauffman, C. Salim Wagner, and L. Sachenbacher (eds.). Data Workers' Inquiry. Creative Commons BY 4.0. https://data-workers.org/michael/

This piece can be used, shared, and adapted with proper attribution.

Trigger Warning

This report includes depictions of stressful working conditions, mental health burdens, and structural injustices that may be distressing for some readers. Reader discretion is advised.
Please proceed with care.

After training as an Air Cargo Agent at Nairobi Aviation College, I dreamed of working in aviation logistics. But after years of unemployment, I joined Sama in Nairobi, labeling data for Meta. That was my entry into the global AI supply chain. From there, I stumbled into another kind of digital work: chat moderation, a job advertised as simple online messaging that turned out to be something else entirely.

Chat moderators are hired by companies such as Texting Factory, Cloudworkers, and New Media Services to impersonate fabricated identities, often romantic or sexual, and chat with paying users who believe they’re forming genuine connections. The goal is to keep users engaged, meet message quotas, and never reveal who you really are. It’s work that demands constant emotional performance: pretending to be someone you’re not, feeling what you don’t feel, and expressing affection you don’t mean.

Over time, I began to suspect that I wasn’t just chatting with lonely users. I was also helping to train AI companions, systems designed to simulate love, empathy, and intimacy. Many of us believed we were simultaneously impersonating chatbots and teaching them how to replace us. Every joke, confession, and “I love you” became data to refine the next generation of conversational AI.

This project sheds light on a workforce that remains invisible yet essential: the people whose emotions fuel algorithms that pretend to feel. It is a call for recognition, dignity, and transparency in an industry that profits from the pretense of connection while erasing the humans behind it.

About the Author

Michael Geoffrey Asia

Michael Geoffrey Abuyabo Asia is a researcher and labor advocate whose work centers the emotional labor, psychological strain, and hidden human expertise behind chat moderation and AI training. He has firsthand experience across multiple global outsourcing platforms—working on Meta projects under Sama and holding roles at CloudFactory, TELUS International, TransPerfect DataForce, Appen, and NMS Philippines. His background includes impersonating and training AI chat companions, giving him rare insight into one of the most opaque and rapidly expanding forms of digital labor.

As Secretary General of the Data Labelers Association (DLA), Michael leads national efforts to secure fair wages, mental health protections, ethical working conditions, and skills development for data workers. His research interrogates the rise of AI impersonation in chat roles and its consequences for worker dignity, transparency, and accountability. Through both inquiry and