Big Tech Is Manipulating Research Into Its Harm to Society


(MENAFN- Asia Times) For almost a decade, researchers have been gathering evidence that the social media platform Facebook disproportionately amplifies low-quality content and misinformation.

So it was something of a surprise when in 2023 the journal Science published a study that found Facebook's algorithms were not major drivers of misinformation during the 2020 United States election.

This study was funded by Facebook's parent company, Meta, and several Meta employees were part of the authorship team. It attracted extensive media coverage. It was also celebrated by Meta's president of global affairs, Nick Clegg, who said it showed the company's algorithms have "no detectable impact on polarisation, political attitudes or beliefs".

But the findings have recently been thrown into doubt by a team of researchers led by Chhandak Bagch from the University of Massachusetts Amherst. In an eLetter also published in Science, they argue the results were likely due to Facebook tinkering with the algorithm while the study was being conducted.

In a response eLetter, the authors of the original study acknowledge their results "might have been different" if Facebook had changed its algorithm in a different way. But they insist their results still hold true.

The whole debacle highlights the problems caused by Big Tech companies funding and facilitating research into their own products. It also highlights the crucial need for greater independent oversight of social media platforms.
