The closing of polls in the U.S. will inevitably herald the start of postmortem investigations into the roles Facebook, Twitter, and Alphabet’s YouTube played in the vote. The problem with a postmortem is that, by definition, it suggests the issue is dead. But while the U.S. presidential election might be over for another four years, a dozen other national elections—from Belize to Myanmar to Nigeria to Romania—will take place this year.
And disinformation, propaganda, and fake news are as big a problem elsewhere as they are in the U.S. In some places it’s even more severe. Facebook Inc. subsidiary WhatsApp—which is end-to-end encrypted, making it impossible for moderators to see the content of messages—prevails in countries such as Brazil and India.
Although Facebook has added thousands of employees to vet disinformation and harmful content, they often work in regional hubs. So a moderator responsible for content in Kenya might well be sitting in front of a computer screen in Dublin or even Florida. On-the-ground knowledge can be important. Even a native Swahili speaker might struggle to understand some subtleties, and it's trickier still for an algorithm. Take code switching, for example: when speakers shift between languages or dialects within a single conversation. Kenyan political discourse also tends to lean heavily on proverbs, according to the Nairobi-based analyst Nanjira Sambuli. " 'Nobody can stop reggae' is a popular one here," she says. "But it means different things to different audiences." The original song lyric is about resilience, but it "could mean continue going to the streets irrespective of Covid," she says. It would be hard for an automated system or a moderator in Dublin to grasp that nuance.
In September, BuzzFeed News obtained an internal memo written by former Facebook data scientist Sophie Zhang. The 6,600-word missive reportedly outlined numerous occasions when the company, based in Menlo Park, Calif., had either been slow to respond, or failed to respond at all, to evidence of coordinated campaigns using bots and fake accounts to influence elections and public opinion in Azerbaijan, Brazil, Honduras, Spain, Ukraine, and many other countries. It often ignored problems that arose outside the U.S. and Western Europe, the regions it deemed a priority, BuzzFeed said. Facebook would generally tackle an issue only after it had become the focus of a negative media storm.
Engaging with the problem of disinformation properly on a global level has a corollary benefit: Rather than treating it as an issue that rears its head every election cycle, the technology platforms can take what they’ve learned in another part of the world and apply it at home. Public opinion is shaped and radicalized over years. The election may have ended, but the problem isn’t going away.
Webb is the European tech columnist for Bloomberg Opinion.