In Nairobi, Kenya, over 1,000 workers are being laid off by Sama, a company previously contracted by Meta, Facebook’s parent company, for content moderation services. The layoffs follow a lawsuit filed by former moderators who accused Sama and Meta of poor working conditions, including low pay and inadequate mental health support. The moderators, based at a hub in Nairobi, were responsible for reviewing distressing content from across Africa. Sama has since shifted its focus to AI data labeling and no longer provides moderation services to Meta. The lawsuit, which seeks $1.6 billion in compensation, is ongoing. Meta says it requires its contractors to pay above industry standards and to provide mental health support. The situation highlights the challenges faced by content moderators and the responsibility of tech companies to ensure fair working conditions.
QUESTION: How might the experiences of content moderators in Kenya influence the way social media companies handle harmful content in the future?
