Mark Zuckerberg announced last week yet another change to the Facebook News Feed. Following a contentious year that embroiled the platform in controversy, Facebook intends to give preferential treatment to news sites based on users’ feedback about which providers they trust most.

From Zuckerberg’s post:

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking.”

Who those users are, how they will be selected, and exactly how “trust” will be measured remain to be revealed. News and media professionals don’t appear to have a voice in determining the authority and credibility of news sites.

That’s problematic. Facebook has demonstrated clear vulnerabilities in the past when relying on its community. In mid-2016, when Facebook fired the editors curating its Trending module and relied instead on its algorithm and user engagement around stories, the community proved to be an unreliable arbiter of legitimate news. False stories from dubious sources, such as a report claiming Megyn Kelly had been fired from Fox News for endorsing Hillary Clinton for president, immediately rose to the top. Facebook later overhauled Trending again in an attempt to address those issues.

So far, Facebook’s attempts to police its own platform have done little to mitigate disinformation and “fake news.” The platform itself reported that more than 126 million Americans saw Russian disinformation, originating within the community, in the lead-up to the 2016 election. Furthermore, independent fact-checkers brought in by Facebook to flag fake stories have said efforts to stem the tide of disinformation are falling short.

Outside Facebook’s walls, trust is a contract between an audience, which invests its time, and publishers, who must match that investment with quality journalism. Handing all of that power to the “community” creates dangerous opportunities for propagandists and purveyors of fake news to exploit the platform for their own agendas. During the French elections, special interests organized on platforms like Discord to orchestrate social media events on Facebook and Twitter. More recently, following the November 2017 mass shooting at a church in Sutherland Springs, Texas, a false story spread across Facebook claiming Antifa terrorists were the perpetrators.

At Storyful, we have spent the last two years mapping and understanding the pathways that “fake news” travels. Our work makes it clear that Facebook is a well-trodden avenue for carrying dubious information from private or semi-private platforms and communities to the masses. Following the tragic events in Las Vegas last year, we detailed false claims made by questionable entities on Facebook. In the UK, we highlighted the efforts of a special interest group to affect elections and advance an agenda. And, on our podcast, we discussed the impact of social media and disinformation in India.

What happens in the coming weeks and months may have serious implications for the news industry and the world. Upcoming elections in Eastern Europe, Brazil, Pakistan, Cambodia and the United States (among others) are prime opportunities for those who seek to spread disinformation among an increasingly siloed social media population most inclined to trust sources it already agrees with.

Users the world over flock to Facebook to discuss happenings big and small, local and global, factual and fictional. Left to their own devices, these are the very same users who would assess the value and reach of stories produced by newsrooms that endeavor every day to report facts and vital information.

We at Storyful will be watching these changes closely, and we hope industry experts will have a seat at the table to influence the fate of news on Facebook.

[This post was originally published on Storyful’s blog]