Study: Facebook's System Let Users Spread Misinformation - The Messenger

Study Finds the Design of Facebook Allowed Covid-19 Vaccine Misinformation to Spread Despite Removals

Researchers found that even if Facebook removed troublesome content in one place, users could find it easily elsewhere on the platform.

Engagement with Covid-19 misinformation on Facebook increased after the company took down posts. (Andriy Onufriyenko / Getty Images)

Years of effort to stop the spread of Covid-19 vaccine misinformation on Facebook have had mixed results, according to a new study, and the problem may be caused by the design of the platform itself.

Researchers found that although Facebook removed anti-vaccine content that violated its policies, engagement in dedicated anti-vaccine pages and groups remained at the same level and in some cases increased. Posts about misinformation topics also increased, as did links to low-credibility sources that espoused misinformation.

How could engagement with anti-vaccine content in pages and groups increase if Facebook removed posts? According to the researchers, the answer lies in Facebook’s system architecture, the structure that determines how information flows through the platform and reaches users.

Facebook is more than just content and algorithms, said David Broniatowski, an associate professor at George Washington University and the lead author of the study. At its core, it has a system that was designed to build communities and help people find information that’s interesting to their community.

“Even if you cut out this piece of information here, that piece of information there, and even if you remove large numbers of pages, if there are other pages that are out there that are spreading essentially the same information, then people will just go to those other pages,” Broniatowski stated. “The system makes it easy to find them.”

Facebook’s system structure consists of a three-layered hierarchy. The top layer is made up of pages, broadcast platforms that were originally meant for brands and celebrities but are now used for all sorts of topics, including vaccine refusal. The middle layer consists of groups, areas where people can post and discuss common interests. The bottom layer represents users, who may see anti-vaccine content in their newsfeeds.

By design, this system is flexible and allows users to find information through several alternative paths, the study states, even if some paths are removed. For instance, Facebook’s hierarchy allows groups and pages to coordinate with each other and post the same anti-vaccine content in multiple places at the same time. That means that even if Facebook removed the content in a particular group or page, the content could still be found in other places.
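The redundancy is easy to picture with a toy model. The short Python sketch below is purely illustrative and not taken from the study; the page, group and user names are invented. It models the three layers as a simple graph and shows that deleting a post from one page leaves the same content reachable through another.

# Illustrative sketch only (not the study's model): pages broadcast into
# groups, and groups surface posts to their member users.
pages_to_groups = {
    "Page A": ["Group 1", "Group 2"],
    "Page B": ["Group 2", "Group 3"],
}
groups_to_users = {
    "Group 1": ["user_1", "user_2"],
    "Group 2": ["user_2", "user_3"],
    "Group 3": ["user_3", "user_4"],
}

# The same piece of anti-vaccine content is cross-posted to several pages.
content_hosts = {"claim_x": {"Page A", "Page B"}}


def users_who_can_reach(content_id: str) -> set[str]:
    """Follow every remaining path from hosting pages down to users."""
    reachable = set()
    for page in content_hosts.get(content_id, set()):
        for group in pages_to_groups.get(page, []):
            reachable.update(groups_to_users.get(group, []))
    return reachable


print(sorted(users_who_can_reach("claim_x")))  # all four users can find it

# A moderator removes the post from Page A ...
content_hosts["claim_x"].discard("Page A")

# ... but the copy on Page B still reaches most of the same audience.
print(sorted(users_who_can_reach("claim_x")))  # user_2, user_3, user_4

In this toy version, taking the post down from one page removes only one path; the cross-posted copy keeps flowing to most of the same users, which mirrors the redundancy the researchers describe.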

Pages and groups are good things, Broniatowski said, because they help people get their message out and drive innovation.

“Unfortunately, those exact same tools are being used by anti-vaccine content producers in order to get their message out as well,” he explained.

Broniatowski hopes his group’s research can help platforms and decision makers develop new approaches to dealing with misinformation. He suggests that one way to think about it is to compare social media to architecture, an industry where designers must follow codes to ensure public health and safety. Applying a similar approach to social media, where all platforms follow codes to reduce online harm, is an idea worth exploring, Broniatowski said.
