Meta’s fact-checking pivot may fuel disinformation and threaten hundreds of jobs in Africa 

Meta’s decision to replace fact-checking with community notes on Instagram, Facebook, and Threads will significantly affect content moderation companies and business process outsourcing firms. The change could cost many contractors in African countries their jobs and may hinder the fight against misinformation, according to civil rights activists and media professionals.

Misinformation is a pressing issue in Africa, particularly in countries like Kenya, where millions of users of platforms such as WhatsApp, Facebook, and Instagram are exposed to unchecked manipulative content. With elections approaching in several African nations, the absence of effective fact-checking mechanisms could accelerate the spread of disinformation.

Meta announced the move away from its current moderation model in response to criticism that the fact-checking program was being used to censor information. The shift could hit African content moderators and fact-checking firms financially, creating funding challenges for organizations dedicated to combating misinformation.

Introducing community notes will alter Meta’s financial relationships with its moderation partners and could put trained moderators out of work. It may also limit the ability of organizations such as PesaCheck to address harmful content and protect public discourse.

While Meta’s changes are for now limited to the U.S., shifts in its approach to content moderation could have far-reaching consequences, especially in regions like Africa where misinformation poses a significant threat.