Meta’s East African content moderation hub in Nairobi shuts down as its third-party contractor calls it quits

Meta’s East African content moderation hub is shutting down as the social media giant’s third-party contractor moves away from policing harmful content, cutting around 200 jobs and leaving some employees who rely on work permits at risk of losing their right to remain in the region.

The owner of Facebook, WhatsApp and Instagram first contracted Sama in 2017 to assist with labelling data and training its artificial intelligence, hiring around 1,500 employees, the Financial Times reported. But within two years, the Nairobi office was moderating some of the most graphic and harmful material on Meta’s platforms, including beheadings and child abuse.

Sama, a third-party contractor for Meta, announced on Tuesday that it is shutting down parts of its business, including content moderation for the social media giant, from March 2023. It is letting go of around 3 per cent of its staff, roughly 200 employees, at its Nairobi office, a decision it attributed to the “current economic climate.”

Sama staff were told on Tuesday morning that the company would focus solely on labelling work, also known as computer vision data annotation, such as positioning animations like bunny ears in augmented reality filters.

“The current economic climate requires more efficient and streamlined business operations,” Sama said in a statement. Some Sama staff rely on work permits to remain in the region.

The firm said it is offering support packages to affected employees and encouraging those eligible to apply for roles at its other offices in Kenya and Uganda. Staff who are let go will receive severance packages as well as wellbeing support for up to 12 months after their last day at the company.

“We respect Sama’s decision to exit the content review services it provides to social media platforms,” a Meta spokesperson said in a statement to Insider. “We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content.”

Sama has faced allegations of poor treatment of employees. In 2022, a dismissed content moderator filed a lawsuit against Meta and Sama alleging human trafficking, forced labour and union busting.

A former employee, Daniel Motaung, said he and his colleagues at Sama were exposed to graphic content, including child sexual abuse, animal torture and beheadings, while being paid $2.20 per hour. Some of his peers developed post-traumatic stress disorder (PTSD) as a result, the lawsuit alleged.

Majorel, a Luxembourg-based firm, will take over content moderation for Meta in the region, sources with knowledge of the matter told the Financial Times. Majorel has previously handled content moderation for TikTok.

Majorel has also faced allegations of poor working conditions from its own employees. In August 2022, Insider reported that current and former TikTok moderators employed through Majorel claimed they had experienced severe psychological distress.

Some workers also alleged that Majorel created an environment of near-constant surveillance and unrealistic performance targets.

  • A Tell report