A common complaint from Facebook employees is that there is no centralized vision of what kind of experience the platform's algorithms should steer users toward. A member of the soon-to-be-dissolved Integrity team wrote in a farewell note that harms arose from "the mismatched interaction" between different parts of the platform. What Haugen calls the company's "flat" corporate structure makes it difficult to implement recommended interventions against harmful content. For example, an internal study showed that demoting "deep reshares" — posts reshared by people who were not friends or followers of the original poster — could cut views of civic misinformation by 25 percent and views of civic photo misinformation by 50 percent. Haugen said the intervention was discussed with senior management but never implemented, in part because Facebook did not want to lose the engagement that deep reshares generate. Joe Osborne, a Facebook spokesman, said the company does sometimes demote deep reshares, but only sparingly, because it is a blunt tool that affects benign speech along with misinformation.
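The demotion described above can be illustrated with a minimal sketch. All names, weights, and thresholds here are illustrative assumptions — the article does not describe Facebook's actual implementation, only that deep reshares from non-connections could be downweighted in ranking:

```python
# Hypothetical sketch of demoting "deep reshares" in feed ranking.
# demotion_factor and depth_threshold are invented for illustration.

def rank_score(base_score: float,
               reshare_depth: int,
               connected_to_author: bool,
               demotion_factor: float = 0.5,
               depth_threshold: int = 2) -> float:
    """Downweight posts reshared through long chains by users
    who are not friends or followers of the original poster."""
    if reshare_depth >= depth_threshold and not connected_to_author:
        return base_score * demotion_factor
    return base_score

# A deep reshare from a stranger scores lower than a direct post.
print(rank_score(10.0, reshare_depth=3, connected_to_author=False))  # 5.0
print(rank_score(10.0, reshare_depth=0, connected_to_author=True))   # 10.0
```

The trade-off Osborne alludes to is visible even in this toy version: the rule cannot distinguish a benign viral post from misinformation, so any demotion suppresses both.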
https://www.washingtonpost.com/business/how-facebook-algorithms-can-fight-over-your-feed/2021/11/29/36b6d092-517f-11ec-83d2-d9dab0e23b7e_story.html?utm_source=rss&utm_medium=referral&utm_campaign=wp_business — How Facebook Algorithms Can Fight Over Your Feed