Meta forms a team to monitor posts on Facebook and Instagram

Meta, the company that owns Facebook, Instagram, and WhatsApp, has established a cross-functional team to monitor posts made by Nigerians before, during, and after the country's 2023 elections, particularly on Facebook and Instagram.

According to Meta, the team was assembled to combat hate speech, disinformation, and fake news throughout the election season.

It emphasized a number of other actions it has taken to safeguard the credibility of the Nigerian elections. 

Team members' backgrounds: Meta stated, without going into specifics, that the team is made up largely of Nigerians and others who have lived in the country for some time.

People with international expertise in disinformation, hate speech, elections, and misinformation are also on the team. "Prior to, during, and following Nigeria's general elections in 2023, these teams are working hard to prevent any exploitation of our services. We also employ local Nigerians to work in public policy, public policy programs, and communications," the statement continued.

Why the change? Adaora Ikenze, Head of Public Policy for Anglophone West Africa at Meta, said during a press conference on Wednesday in Lagos that the company's strategy for the upcoming elections was influenced by its experience with previous elections in Sub-Saharan Africa and its discussions with human rights organizations, NGOs, local civil society organizations, regional experts, and local election authorities.

"When it comes to helping keep people safe during the elections, we are aware that we have a significant obligation. We've made significant investments in people and technology, drawing on past experience as well as advice from experts and policymakers from across the national spectrum, to reduce misinformation, remove harmful content from our platforms, combat voter interference, and encourage civic engagement during the elections. To make sure we're ready for the unique issues in Nigeria and taking the proper action to stay ahead of emerging threats, we continue to engage closely with local partners in Nigeria and election authorities," she said.

She also revealed that since 2016, Meta has invested more than $16 billion in teams and technology in this area, quadrupling the size of its global teams working on safety and security to roughly 40,000 people. Additionally, approximately 15,000 content reviewers are dispersed across the world's main timezones.

These reviewers, according to her, can evaluate content in more than 70 languages, including Hausa, Igbo, and Yoruba.

WhatsApp misinformation: Ikenze said that while many people use WhatsApp extensively to distribute false information through forwarded messages, Meta has implemented measures to curb the spread of viral falsehoods. According to her, any message that has already been forwarded can no longer be forwarded to more than one group at a time.

"The number of highly forwarded messages sent on WhatsApp decreased by almost 70% after we implemented this feature. We also label messages as 'forwarded' or 'highly forwarded' to draw attention to messages that have been shared many times," she added.
