Mark Zuckerberg announced last week yet another change to the Facebook newsfeed. Following a contentious year that embroiled the platform in controversy, Facebook intends to give preferential treatment to news sites based on users’ feedback as to which providers are most trusted.
From Zuckerberg’s post:
“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking.”
Who those users are, how they are selected and exactly how “trust” is measured remains to be revealed. News and media professionals don’t appear to have a voice in determining the authority and credibility of news sites.
That’s problematic. In the past, Facebook demonstrated clear vulnerabilities when relying on its community. In mid-2016, when Facebook fired the editors curating its Trending module and instead relied on its algorithm and user engagement around stories, the community proved itself to not be the most reliable arbiter of legitimate news. False stories from dubious sources immediately rose to the top, including a report falsely claiming that Megyn Kelly had been fired from Fox News for endorsing Hillary Clinton for president. Facebook later changed Trending again to try to tackle those issues.
So far, Facebook’s attempts to police its own platform have had little impact on the mitigation of disinformation and “fake news.” The platform itself reported that over 126 million Americans saw Russian disinformation emanating from the community in the lead-up to the 2016 election. Furthermore, independent fact-checkers brought in by Facebook to flag fake stories have said efforts to stem the tide of disinformation are falling short.
Outside of Facebook’s walls, trust is a contract between the audience, who invests time, and publishers, who must match that investment with quality journalism. Handing all of that power to the “community” creates dangerous opportunities for propagandists and purveyors of fake news to exploit the platform to further their own agendas. During the French elections, special interests organized on platforms like Discord to orchestrate social media events on Facebook and Twitter. More recently, following a November 2017 mass shooting at a church in Sutherland Springs, Texas, a false story spread across Facebook claiming Antifa terrorists were the perpetrators.
At Storyful, we have spent the last two years mapping and understanding the pathways that “fake news” travels. Our work makes it clear that Facebook is a well-trod avenue for disseminating dubious information from private or semi-private platforms and communities to the masses. Following the tragic events in Las Vegas last year, we detailed false claims made by questionable entities on Facebook. In the UK, we highlighted the efforts of a special interest group to affect elections and advance an agenda. And, on our podcast, we discussed the impact of social media and disinformation in India.
What happens in the following weeks and months may have very serious implications for the news industry and the world. Upcoming elections in Eastern Europe, Brazil, Pakistan, Cambodia and the United States (among others) are prime opportunities for those who seek to spread disinformation via an increasingly siloed social media population who are most likely to trust sources they agree with.
Users the world over flock to Facebook to discuss happenings big and small, local and global, factual and fictional. Left alone, these would be the very same users that would assess the value and reach of stories generated by newsrooms that endeavor every day to report facts and vital information.
We at Storyful will watch for any further developments on these changes and hope industry experts will have a seat at the table to influence the fate of news on Facebook.
[This post was originally published on Storyful’s blog]
Facebook comments can’t guarantee a lack of anonymity
On August 19, 2011
In Community Engagement, Facebook, Industry News & Notes
There’s a conventional wisdom out there in the online journalism world that: 1.) News site comments will automatically be better if people have to use real names, and 2.) Using Facebook for your comments will accomplish this.
I’ve said many times before that I don’t think anonymity is the problem. My campaign on that seems to be a lost cause so far. As a former comment moderator and current manager of social media accounts, I know for a fact that people have absolutely no problem spouting hateful views and violent rhetoric under their real name. I see it every day.
Aside from that, there’s also all kinds of evidence that Facebook comments aren’t the end-all, be-all answer on this front.
As my friend Jeff Sonderman recently wrote at Poynter, Facebook comments can be a boon to news sites in lots of ways: Increased Facebook traffic referrals, fast page load times, an easy out-of-the-box comment solution.
One thing Facebook doesn’t do, however, is prevent anonymity (as the same article and several others insist).
While Facebook’s rules require people to use their real names, the rule is not always followed. I have several Facebook friends who use false names for various personal reasons – and they are all, essentially, anonymous. That said, they are still identifiable to their friends, which still keeps some people in check with their online comments. (Though this certainly doesn’t apply to everyone.)
The biggest threat to the alleged transparency and decency of Facebook-powered commenting lies in the same tool many news organizations use to communicate with readers: Facebook Pages.
Just speaking anecdotally here (if you have stats to back me up, please help), I’ve seen an uptick of abusive posts and trolling on Facebook ever since it rolled out its new pages in February. That rollout included the new ability to use Facebook as a page.
This change made it possible for just about anyone to set up a fake character on Facebook – and then use Facebook as that character. On The Huffington Post’s pages (on which I am an administrator) and the pages of other groups and news organizations, I’ve seen these fake accounts spreading spam, trolling the page’s regular users and making hateful statements under the guise of a made-up character.
Here are a few examples of alias accounts I found on news pages:
Even before this change, there was a history of false profiles spamming and trolling Facebook; the addition of Use-As-Page to the toolbox only gave trolls a new way to stay in business. Facebook has staff that deals with those accounts when they are found or reported, but it certainly can’t be easy for them to keep up with people who are dead set on being trolly.
(Related aside: When I worked as a comment moderator for the Cincinnati Enquirer, a troublesome site user with many usernames emailed me to say, “I’m retired and have nothing else to do but create new accounts every time you block me. I can make your life miserable.” This is just a sampling of the mentality of trolls, folks. Here’s another.)
Now, I’ve got no doubt that some news sites have seen higher-quality discussion after installing Facebook commenting; it’s definitely better than many of the in-house or other out-of-the-box solutions I’ve seen on news websites. It likely is the best option for sites that don’t have the technical expertise and manpower to host and manage a heavy flow of onsite comments day in and day out – so long as they don’t mind handing a big part of their community to Facebook.
I’m just warning that news sites shouldn’t assume that Facebook on its own will solve their commenting problems. Users can and will still be hateful trolls, whether anonymous or fully identifiable. To make it work, you still need a daily moderation workflow and a newsroom-wide commitment to not only reading story and blog comments, but responding to them.