Facebook's core design sabotaged the social media giant's efforts to combat misinformation running rife on the platform, scientists analysing its misinformation policies said.
The platform's architecture pushed back even when Facebook tweaked its algorithms and removed content and accounts to combat vaccine misinformation, the researchers at the George Washington University, US, found.
Despite Facebook's significant effort to remove anti-vaccine content during the COVID-19 pandemic, engagement with such content did not decline, according to their study published in the journal Science Advances.
The scientists say these consequences result from what the platform is designed to do: enable community members to connect over common interests, which include both pro- and anti-vaccine persuasions.
"(Facebook) is designed to allow motivated people to build communities and easily exchange information around any topic," said David Broniatowski, lead study author and an associate professor of engineering management and systems engineering.
"Individuals highly motivated to find and share anti-vaccine content are just using the system the way it's designed to be used, which makes it hard to balance those behaviours against public health or other public safety concerns," said Broniatowski.
Among the anti-vaccine content that remained on the platform, links to off-platform, low-credibility sites and "alternative" social media platforms increased, the researchers said.
This remaining content also contained more misinformation, including sensationalist false claims about vaccine side effects that were often too new to be fact-checked in real time, they found.
Further, anti-vaccine content producers were found