African tech workers press global social media giants for better conditions

Carlos Mureithi
Mophat Okinyi, once a content moderator for ChatGPT, is seeking to unionize his former colleagues to improve their pay and conditions.

Mophat Okinyi hates remembering the job he used to do for ChatGPT.

For about $1.50 an hour, he read hundreds of descriptions of pedophilia and incest for the artificial intelligence platform every day. As a quality analyst, his job was to confirm that his subordinates had read and classified potentially harmful content correctly.

"Some of these things are very shocking," says Mr. Okinyi. "We shouldn’t even talk about some of these texts."

Why We Wrote This

Tech giants like Facebook and YouTube have been accused of taking advantage of weaker labor laws in Africa. Now content moderators are using legal avenues to win fair pay and support.

The work left him so traumatized, he says, that he drifted apart from his family and eventually separated from his wife. “If you put so much dirty content in your mind, it changes you,” he says.

Now, Mr. Okinyi and some 150 other content moderators working for tech powerhouses, including Facebook and TikTok, hope to form a union to improve their pay and working conditions.

“We're trying to make this job safe for those who will do it in future and those who are doing it right now,” says Mr. Okinyi.

Their decision to unionize shines a spotlight on the way tech giants use human labor in Africa, where, through outsourcing, they hire hundreds of people to remove harmful content from their platforms. African content moderators hope to force tech corporations to provide adequate mental health care and fair pay for everyone who works for them – including non-traditional employees such as themselves.

Courtesy of Foxglove
Current and former African content moderators working for global tech companies vote to form a trade union, May 1, 2023.

“Unionization signals that gig work rights are labor rights, and workers deserve the protections provided by law in this field,” says Nanjira Sambuli, a Nairobi-based tech and international affairs fellow at the Carnegie Endowment for International Peace.

Since last year, Facebook’s parent company, Meta, has been facing lawsuits brought by content moderators in Kenya accusing it of union busting, wrongful terminations and insufficient psychological support, among other infringements. In one case, Meta claimed it was not the moderators’ employer, and was therefore not liable. The court ruled Meta was the “true employer” and the “owner of the digital work of content moderation.”

“That’s the most significant labour rights decision about content moderation I have seen from any court anywhere,” says Cori Crider, co-founder and director of Foxglove, a London-based nonprofit that’s providing legal advice to the moderators. “If Facebook is held the true employer of these workers, then the days of hiding behind outsourcing to avoid responsibility for your critical safety workers are over.”

But the decision isn’t set in stone yet; Meta is appealing it.

Outsourcing responsibility 

For years, tech platforms have faced intense criticism around the world for failing to filter divisive content. In Africa, Facebook came under fire last year for alleged inaction over hateful material that eventually incited violence during the war in northern Ethiopia. A study last month by Global Witness found that YouTube, Facebook and TikTok approved extreme, hate-filled ads in South Africa, where xenophobic violence has flared up in recent years.

As a result, tech powerhouses have invested heavily in removing material including hate speech, misinformation and incitement to violence from their platforms. Many of the workers who undertake this vital but grueling task are hired through outsourcing companies and are based in countries like Kenya, India and the Philippines, which supply quality labor at cheap prices.

In Nairobi, a regional tech hub, outsourcing companies bring talent from numerous African countries to moderate content in different African languages. They include Sama, a San Francisco-headquartered company, which has contracted workers for Facebook and ChatGPT. Majorel, headquartered in Luxembourg, hires labor for Facebook and TikTok.

Courtesy of Foxglove
Signs and sticky notes, at the meeting when current and former content moderators voted to form a union, suggest names for the new organization.

Global tech companies believe that by outsourcing, they can escape responsibility, says Odanga Madung, a senior researcher at the Mozilla Foundation in Nairobi, whose work focuses on the impact of tech platforms in Africa.

“Irresponsibility has always been good business in the capitalist contexts,” he says. “Taking care of people is expensive, more so if you’re exposing them to graphic content on behalf of your users.”

Accusations of exploitation of content moderators are not unique to Africa. Moderators in the United States and Ireland have previously sued Facebook over mental health issues related to their work. In Germany, a Berlin-based trade union called Verdi has recently been helping content moderators for TikTok and Facebook to unionize.

But the move by African workers to form a union is a novel approach outside the West. At the top of the members’ list are regular, professional mental health checkups and pay standardized with that of their peers across the world.

After graduating from university, Mr. Okinyi joined Sama in Nairobi in 2019 for his first job. He worked on various projects for different foreign tech companies, doing data labeling, product classification and other tasks. But it was his content moderation work for ChatGPT, starting in 2021, that affected him in unforeseen ways.

ChatGPT is a chatbot that answers a user’s question using a language model built from text drawn from the web. Critics say this reliance on mining the internet makes it vulnerable to toxic material.

For the six months that he worked on ChatGPT, Mr. Okinyi’s work began early, ended late, and left him emotionally drained. 

Every day at work, he read some 700 texts about child sexual abuse and flagged them according to their severity. Over eight-hour shifts of reading and labeling this material, he enabled ChatGPT to filter out harmful requests.

Although Sama provided counselors, Mr. Okinyi says, productivity demands at work meant he and other workers barely had time to see them.

OpenAI, ChatGPT’s developer, didn’t respond to a request for comment.

Carlos Mureithi
Kauna Ibrahim, a former content moderator for Facebook, poses for a photo in Nairobi, Kenya, June 13, 2023.

The situation was equally appalling for Facebook content moderators at Sama. Kauna Ibrahim, a Nigerian, spent four years watching hundreds of horrific videos every day at work, including sexual abuse and beheadings. For roughly $3 an hour, she assessed whether the videos violated Facebook’s policies.

During her first year of work, she began suffering panic attacks.

“Some of the images never leave you. You find yourself unable to sleep. Sometimes you dream of what you have seen,” says Ms. Ibrahim, who was a graduate student in clinical psychology at the time. “But because you do it every day, you just survive.”

Sama’s therapists weren’t qualified and didn’t provide enough psychological support, Ms. Ibrahim says. So she sought out her own therapist.

Sama says it provides “qualified and licensed” professionals to offer therapy for its workers, and that it uses an “internationally-recognized” methodology to set wages, making its pay “internationally comparable and locally specific.”

Meta declined to comment because of the ongoing lawsuits.

Ms. Ibrahim was among 260 workers whose contracts were terminated in March 2023 after Sama stopped doing work for Facebook. Sama’s work for ChatGPT ended in March 2022, and Mr. Okinyi later moved to Majorel to do customer service work for a European e-commerce company.

On Labor Day this year, both Mr. Okinyi and Ms. Ibrahim sat alongside about 150 other content moderators for Facebook, TikTok and ChatGPT in a Nairobi hotel, and voted to unionize. Mr. Okinyi understands the fight may not be over, but he’s willing to keep pushing with his colleagues to ensure their voices are heard.

“We want to be united because if we're united, we become strong,” he says.
