Dispatches from the living amongst journalism's walking dead

Tag: community management

How to Maintain a Safe, Positive and Public Facebook Life

So you’ve turned on Facebook Subscribe. Now what? Here are some suggestions from someone who’s been doing it a while. What would you add? Leave suggestions in the comments.

Set up friends lists to help direct posts.

Click on ‘Friends’ on the left side of your profile. Here you can sort, search and assign friends into lists of your choosing. Take the time to create lists based on the sort of things you share. Maybe you have a list for family and friends to show off photos of your kids/pets/self. Maybe you have one just for coworkers or work-related purposes.

Be selective about who you share with.

You can direct individual status updates, photos, videos, notes and galleries to very granular groups (based on those friends lists you made). Your subscribers likely don’t care about your dinner plans with friends, so maybe those sorts of updates should be directed to friends only. Also take the time to consider the privacy of those you tag or feature in posts or images; they may not want to be exposed to your public audience.
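
If it helps to see the idea spelled out, here’s a minimal sketch in Python of how list-based sharing works conceptually. The list names, data structures and function are my own illustration, not Facebook’s actual model: a post is visible to someone only if they’re in one of the lists it was shared with (or if it’s public).

```python
# Hypothetical model of friends lists and post audiences (illustrative only,
# not Facebook's real data model or API).

friend_lists = {
    "family": {"mom", "dad", "sister"},
    "coworkers": {"editor", "photog"},
}

def can_see(post_audience, viewer):
    """Return True if the viewer is in any list the post was shared with."""
    if "public" in post_audience:
        return True  # public posts are visible to everyone, subscribers included
    return any(viewer in friend_lists.get(name, set()) for name in post_audience)

# A dinner-plans update aimed only at family stays hidden from subscribers.
print(can_see({"family"}, "mom"))         # True
print(can_see({"family"}, "random_sub"))  # False
print(can_see({"public"}, "random_sub"))  # True
```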

Be smart.

Don’t share where you live or details about your schedule in public posts. And ladies, consider what your public posts say to the sexual harassers, stalkers and all-around creeps who hang out on Facebook. I’ve encountered some real weirdos who’ll respond in an uncomfortable fashion to just about any post – I try not to encourage them.

Manage your comments.

If you have comments turned on for subscribers, keep an eye on them. People will sometimes spam you, say horrible things or pop into a conversation thread like a bull in a china shop with a “So hottt. C me in Turkiye”. You need to delete stuff sometimes; your friends and subscribers are depending on you to keep the comments cleared. Do this by hovering over the right side of the comment until you see an X, then click to delete it.

Don’t be afraid to block people.

If someone is spamming you or being abusive to you or your commenters, don’t hesitate to block them from your page. Do this by first deleting the comment; you’ll then get an option to block the user.


What else would you add?

Facebook comments can’t guarantee a lack of anonymity

There’s a bit of conventional wisdom out there in the online journalism world: 1) news site comments will automatically be better if people have to use real names, and 2) using Facebook for your comments will accomplish this.

I’ve said many times before that I don’t think anonymity is the problem. My campaign on that seems to be a lost cause so far. As a former comment moderator and current manager of social media accounts, I know for a fact that people have absolutely no problem spouting hateful views and violent rhetoric under their real name. I see it every day.

Aside from that, there’s also plenty of evidence that Facebook comments aren’t the be-all, end-all answer on this front.

As my friend Jeff Sonderman recently wrote at Poynter, Facebook comments can be a boon to news sites in lots of ways: Increased Facebook traffic referrals, fast page load times, an easy out-of-the-box comment solution.

One thing Facebook doesn’t do, however, is prevent anonymity (despite what the same article and several others insist).

While there is a rule on Facebook that one has to use their real name, it’s not always followed. I have several Facebook friends who use false names for various personal reasons – and they are all, essentially, anonymous. That said, they are still identifiable to their friends, which still keeps some people in check with their online comments. (Though this certainly doesn’t apply to everyone.)

The biggest threat to the alleged transparency and decency of Facebook-powered commenting lies in the same tool many news organizations use to communicate with readers: Facebook Pages.

Just speaking anecdotally here (if you have stats to back me up, please help), I’ve seen an uptick in abusive posts and trolling on Facebook ever since it rolled out its new pages in February. That rollout included the new ability to use Facebook as a page.

This change made it possible for just about anyone to set up a fake character on Facebook – and then use Facebook as that character. On The Huffington Post’s pages (on which I am an administrator) and the pages of other groups and news organizations, I’ve seen these fake accounts spreading spam, trolling the page’s regular users and making hateful statements under the guise of a made-up character.

Here are a few examples of alias accounts I found on news pages (or skip below if you want):

[Embedded gallery of example alias accounts]

Even before this change, there was a history of false profiles spamming and trolling Facebook; the addition of Use-As-Page to the toolbox only gave trolls a new way to stay in business. Facebook has staff who deal with those accounts when they are found or reported, but it certainly can’t be easy for them to keep up with people who are dead set on being trolly.

(Related aside: When I worked as a comment moderator for the Cincinnati Enquirer, a troublesome site user with many usernames emailed me to say, “I’m retired and have nothing else to do but create new accounts every time you block me. I can make your life miserable.” This is just a sampling of the mentality of trolls, folks. Here’s another.)

Now, I’ve got no doubt that some news sites have seen higher-quality discussion after installing Facebook commenting; it’s definitely better than many of the in-house or other out-of-the-box solutions I’ve seen on news websites. It likely is the best option for sites that don’t have the technical expertise and manpower to host and manage a heavy flow of onsite comments day in and day out – so long as they don’t mind handing a big part of their community to Facebook.

I’m just warning that news sites shouldn’t assume that Facebook on its own will solve their commenting problems. Users can and will still be anonymous (or even identifiable) hateful trolls. To make it work, you still need a daily moderation workflow and a newsroom-wide commitment to not only reading story/blog comments, but responding to them.


Pay-to-play commenting can eliminate trolls – and kill discussion

Would you give your credit card number to be allowed to have a letter to the editor printed in the newspaper? Think it’s an absurd question? Maybe not.

Beginning today, The Sun Chronicle (in Attleboro, MA) is abolishing anonymous comments the only foolproof way they know how: by attaching usernames to credit transactions.

The paper is charging commenters a one-time fee of 99 cents, to be paid by credit card, so that each user’s comments and community name will be tied to the name on the paying card (which also is tied to their real address and phone number).

This isn’t all that new, of course. It’s a similar approach to the one Honolulu news start-up Civil Beat takes for its site’s discussion membership level, which charges 99 cents a month via PayPal to leave comments on the site. When Jay Rosen was here visiting us at TBD a few weeks back, he sang the praises of this system for keeping trolls out of their (notably civil) online discussions.

I, as you might gather from past posts, do not agree with the entire premise of this plan for several reasons.

First and foremost, this move can and will shut certain segments of the paper’s readership out of ever being able to post comments. Aside from the trolls they want to eliminate, the paper can also count out those who do not have a credit card. That can include young people, those with credit problems or otherwise bad finances, those who don’t trust online financial systems – and numerous other possibilities I’m sure aren’t coming to mind right away.

And anonymity, while it can breed ugliness in online comments, has its virtues as well. The ability to speak out without identification is a necessary part of sometimes difficult discussions (like the kind we have on news sites).

Eva Galperin of the Electronic Frontier Foundation expounded eloquently on this point in a different case (involving an embarrassing edict and retraction by the gaming company Blizzard):

Anonymous speech has always been an integral part of free speech because it enables individuals to speak up and speak out when they otherwise may find reason to hide or self-censor. Behind the veil of anonymity, individuals are more free to surface honest observations, unheard complaints, unpopular opinions…

Without anonymity, the comments may end up being quite banal. The next time the Sun Chronicle wants to crowdsource a story (if they do that sort of thing), they can rule out getting anyone to talk openly about their medical conditions, their families, whether they witnessed a crime or whether they’re having money problems – anything they wouldn’t want the whole community to know.

And finally, is this sort of step really necessary to control comments anyway? As I’ve said before, it is possible to create a robust online community by simply being more engaged as a staff. Better community via interaction is what we aim to do where I work.

Going back to the Civil Beat model, it should be noted the site’s discussions have staff hosts who are an active and visible presence in their threads. How much of Civil Beat’s, er, civility is actually better attributed to staff interaction as opposed to its identified commenters?

Of course, that level of interaction requires staff hours most news orgs can’t or won’t spare. There are other, less time-intensive methods built into comment systems that other sites have managed to use to control trolls.

As the Editors Weblog noted:

…many prominent publications such as The Globe and Mail and NYT are able to maintain flourishing online communities by instituting a combination of user-rankings (inappropriate comments are quickly down-voted while insightful ones get promoted to the top of the page) and paid moderators.
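
For what it’s worth, the mechanics behind that kind of user-ranking are simple enough to sketch. Here’s a rough illustration in Python (my own toy example, not The Globe and Mail’s or the NYT’s actual system): comments get sorted by net votes, and anything buried below a threshold is collapsed for the human moderators to review.

```python
# Illustrative vote-ranking: promote upvoted comments, collapse downvoted ones.
# The threshold, field names and sample comments are made up for this example.

HIDE_THRESHOLD = -5  # net score at which a comment gets collapsed

comments = [
    {"user": "reader1", "text": "Thoughtful point about the levy.", "up": 42, "down": 3},
    {"user": "troll99", "text": "You people are all idiots.", "up": 1, "down": 19},
    {"user": "reader2", "text": "Here's a link to the actual study.", "up": 17, "down": 0},
]

def rank(comment_list):
    """Sort by net score, best first, and flag buried comments for collapsing."""
    for c in comment_list:
        c["score"] = c["up"] - c["down"]
        c["collapsed"] = c["score"] <= HIDE_THRESHOLD
    return sorted(comment_list, key=lambda c: c["score"], reverse=True)

for c in rank(comments):
    marker = "[collapsed] " if c["collapsed"] else ""
    print(f"{marker}{c['score']:+d}  {c['user']}: {c['text']}")
```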

It seems like overkill to ban anonymous comments in this fashion when there are other options available that can yield similar results – and still allow open discussion.

An anonymous comment ban could kill the public forum

In light of the Cleveland Plain Dealer‘s recent outing of an anonymous commenter on their site, columnist Connie Schultz comes out against anonymous comments on news sites altogether.

I’m not at all surprised she’d take this stance – most reporters seem to feel this way because (I theorize, anyway) they have to put their names on everything they write and wish everyone who attacked their work had to do the same. It’s understandable, but in a lot of ways also very hypocritical.

Journalists want whistle-blowers to rat out government, friends and bosses and live for meaty quotes sharing unpopular or even dangerous points of view. We’ll also usually be happy to let you express those opinions anonymously — just so long as we get to put our bylines on them. We want to serve as a community hub and “voice of the people”, but only want to allow certain opinions to be heard.

The commenters on the story note that readers appreciate knowing who is saying what, and many acknowledge that it probably would improve the tenor of comments – but they also know it would cut back on dialogue at large (and not always the bad kind). Here’s a comment from a user named RVA123:

There are some risks with requiring names on Cleve.com forums: Though you may be able to ultimately verify authenticity, creating and posting false names will still be too easy for motivated trolls. It probably reduces participation – which can be perceived as a good thing if it reduces irresponsible posts written solely to drive a negative reaction, and a bad thing if it kills your conversations (and a potential revenue stream for the site) altogether.

Several other commenters note they’d be less likely to share opinions under their real names because they don’t want their bosses and neighbors to know their political leanings, what they watch on TV, where they live or what they REALLY think of their jobs. It isn’t that they have something to hide or have such outrageous opinions they’d never want their names attached – they just want the modicum of privacy they feel the Internet has provided in the last decade or so.

So is less conversation really what we want? Is it better if we have fewer opinions so long as they’re all bylined and well thought-out? From the reactions I hear in my own newsroom every day, I’d say it’s an overwhelming opinion that yes, that’s exactly what we want.

I don’t like being in the position of defending the sort of toxic, anonymous comments that currently permeate news sites, but I believe we as an industry are clinging to an outdated model of what it means to allow the community to have its say. We think that by printing a handful of letters to the editor we are responsibly letting readers have a say because they put their names on those letters. Never mind that those letters usually don’t represent an entire generation of readers – one that tends to do most opinion-sharing online using social media – and are overwhelmingly submitted by white writers.

Aside from any demographic arguments that could be made (and I’d love more and better data if anyone has it), I know how I feel about what I read. My local letters to the editor regularly seem to me to be written by people who aren’t my age and don’t have much in common with my way of life, so I don’t consult them to find out real community reaction on the issues I care about and neither do most of my contemporaries. I turn to blogs, Twitter, Facebook and, yes, the comments on the stories themselves, to see what people have to say. There are a lot more of them – and they’re often far more familiar to me.

If news sites were to eliminate anonymous comments, we should consider what kind of reader would be left out in the cold. Not every anonymous commenter is a racist stalker with an axe to grind – so maybe we shouldn’t be so quick to throw the proverbial baby out with the bath water.

Devil’s advocate: Like it or not, site comments represent the community

All of the talk here and elsewhere on news site comments lately has had my brain working overtime. It’s obvious from all the, heh, commentary, that the content of news website comments is a big thorn in the side of most journalists and steadfast news junkies. I hear about it every day.

“They’re toxic.”

“That’s not conversation.”

“They don’t represent the community at all.”

Or do they?

It isn’t a possibility I as a member of the human race would like to face, but what if these comments that we insist only come from fringe corners of the mean old interwebs really do represent our communities?

Consider this… When I encounter particularly prolific, appalling or trollish accounts on Cincinnati.Com, I’ll look up their IP address to see if they’re posting from our coverage area. In these random hunts, I have never found one that wasn’t local.
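
(For the curious, that kind of spot check is easy to script. Below is a minimal sketch using MaxMind’s geoip2 Python library and a local GeoLite2 database; the database path and the list of “coverage area” regions are my assumptions for illustration, not a description of Cincinnati.Com’s actual tools.)

```python
# Sketch: does a commenter's IP geolocate to the coverage area?
# Assumes a local MaxMind GeoLite2-City database; path and regions are illustrative.
import geoip2.database

COVERAGE_REGIONS = {("US", "OH"), ("US", "KY"), ("US", "IN")}  # greater Cincinnati, roughly

def is_local(ip_address, db_path="GeoLite2-City.mmdb"):
    """Return True if the IP resolves to one of the coverage-area regions."""
    reader = geoip2.database.Reader(db_path)
    try:
        resp = reader.city(ip_address)
        region = (resp.country.iso_code, resp.subdivisions.most_specific.iso_code)
        return region in COVERAGE_REGIONS
    finally:
        reader.close()

# Example: check the IP recorded on a flagged comment.
# print(is_local("203.0.113.42"))
```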

For better or worse, these members do represent part of the readership we claim to serve. As ugly as it might be, they are part of the fabric of this community, so should we as a news organization and conversation hub be trying to suppress their opinions?

We know, at the very least, they represent the most vocal and opinionated elements of the community. They simply care more than those who oppose them.

So how much responsibility does the community itself bear for allowing toxic, racist, partisan trolls to represent the coverage area at large? If the rest of the community has a problem with their viewpoints, registration on Cincinnati.Com is free. Why not take them on? At the very least, you are free to correct them and share your views, too. You can’t let the crazies win.

I don’t necessarily believe this, of course. I know good moderation, staff interaction and better comment tools can help shape comments into conversation. These are, however, the sort of questions we have to be asking ourselves if we as journalists really want to be part of the communities in which we live and work.

“These people” are out there. Some are subscribers. All are readers. Chew on that for a bit and let me know what you think.

Anonymity isn’t to blame for bad site comments, it’s a lack of staff interaction

A Twitter discussion I glimpsed Sunday – and a follow-up blog post and discussion about it from Steve Buttry – had me thinking a lot yesterday about anonymous commenting on news sites. Of course, a lot of that also comes from the fact that I returned from a week-long furlough to moderate comments on the morning after the health care reform bill passed (I don’t know what the mood is like where you are, dear reader, but it’s pretty heated here in Southwest Ohio).

As I’ve written here before, it is part of my job to navigate the waters of Cincinnati.Com’s article and blog comments to determine what should stay and what gets removed as per our terms of service. Back in 2008, I helped set up the site’s comment system, wrote our discussion guidelines and laid the groundwork for how comments would be moderated. The process has evolved and grown to keep up with what we’ve learned from interacting with and watching our community members – and it’s given me a unique perspective on anonymity and commenting.

From all the comments I’ve removed and all the users I’ve had to block from our sites, I’ve learned a few things that have led me to believe that anonymity doesn’t really matter at all. Here’s why:

1. Most users who have had comments removed do not believe their comment was racist/homophobic/libelous/spam – and they would see no problem posting that comment again (and again) under their real names.

2. Most users who have comments removed or are kicked off the site have no problem contacting staff by phone or email to complain, thus dropping their anonymity in most cases. Aside: The best is when they use a work email address to defend their statements about how “X race is too lazy to work”. Hilarity.

3. Banned or unverified users will find a way to post what they want to post. Whether it’s creating a fake Facebook/OpenID identity, a new IP address, dozens of Hotmail addresses or cleared cookies – they’ll do whatever it takes to get around a login system. There are about five users I have kicked off our site dozens of times – and there’s seemingly nothing I can do to get them to go away permanently. One even went so far as to tell me, “Do what you want. I have nothing but time on my hands – and you don’t.”

On the flip side, I am a longtime member of a message board that has very few of these problems. The site’s thousands of users know and respect one another for the most part, conversations stay on-topic and free of hate speech and I rarely see users or comments removed. What’s their secret? Constant moderator interaction.

A moderator is always online – and there is an indication of this that shows up on the forum. The moderator regularly participates in discussion, responds to questions and, most importantly, will give warnings publicly when they are needed. It’s not uncommon to see a gentle “Hey guys, let’s try to get this back on topic” or “I had to remove a few posts that got pretty heated, try to keep it civil, folks”. Sometimes the moderators don’t even have to do this. Other members will band together to fight off a troll – or defend a friend they feel was wronged. This sense of community derives from the understanding that there’s safety and support supplied by that moderator presence.

Contrast this with the moderator involvement on most news sites. Most users don’t even know a staffer was reading their comments until they are removed. Chances are most users don’t know a site’s moderators until they get a warning. We all know what the solution is, but our paper – and most other sites like ours – is not able to put that amount of manpower into moderation. Community interaction is not a top-level priority to most news outlets – and that’s the real problem.

We as an industry like to collectively wring our hands about the toxicity of online comment boards, but if we really want to improve the quality of on-site discussion we need to be willing to get involved in our sites in a hands-on manner. No amount of word filters, comment-detecting robots and user-end moderation will replace the presence of a dutiful moderator (and that, unfortunately, requires money).
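
(To put a finer point on what those automated tools actually are: a word filter is just pattern matching, and a motivated troll gets around it in seconds; plenty of hateful comments never use a banned word at all. A toy sketch in Python, with a made-up ban list, shows the gap.)

```python
# Toy word filter of the kind built into many comment systems.
# The ban list and examples are invented to show how easily it's evaded.
import re

BANNED = ["idiot", "moron"]

def passes_filter(comment):
    """Reject a comment only if it contains a banned word verbatim."""
    lowered = comment.lower()
    return not any(re.search(rf"\b{re.escape(word)}\b", lowered) for word in BANNED)

print(passes_filter("You are an idiot."))  # False: caught
print(passes_filter("You are an id1ot."))  # True: a trivial misspelling slips through
print(passes_filter("People like you are why this city is failing."))  # True: hateful, but filter-proof
```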

A new media how-to roundup

Every now and again I try to pass along tips on how journalists at any point in their career can add to their skill set. Here are some great tips and how-tos I’ve found lately that you might find helpful if you want to break into media – or break out.

  • Taking the plunge and starting your own blog or news website? OJR has a great checklist to help you get off on the right foot. Whether you’re a college student or a mid-career journalist looking to get your name out there in a new way, this should really help you figure out your plan. And, if you use WordPress to host your blog or site (I recommend it), here’s a friendly DIY guide to WordPress troubleshooting from our friends at the OJB.
  • If you’re looking for a new online storytelling or crowd-sourcing technique, try using a lifestream or eventstream to tell a story in narrative form using tools like Tumblr or Posterous. With a stream, you can combine blog posts, tweets, images and other sorts of updates about a subject from several different people into a single “stream” in chronological order. It’s sort of like a Friendfeed that tells a story (see the sketch after this list). Try it out.
  • Or if you want to get really experimental, try the “mapped” writing model for online news. This technique isn’t so much a narrative as a “choose your own adventure” for long-form news. It involves an overall summary (or nut graf, if you will) followed by a series of “threads” that don’t need to be read in a particular order. I learned about this model in an online journalism class back in j-school – and I never thought it would come into use. Whaddya know.
  • Data fiends, multimedia producers and Flash fanatics can get great ideas for unique and innovative maps from 10,000 Words. Data visualization is a big deal for online media, but now the key is making those maps simpler, prettier and fun. (Note: The images on the post are blown out, but it’s a solid list of examples.) If you’re just a wannabe data fiend, the blog also has tips for finding and visualizing data. Very cool.
  • User-generated content doesn’t have to mean “amateur” content. The Knight Digital Media Center offers up some great tips for training citizen journalists that could make submitted news a valuable information asset for your site (and it helps the community too). Remember, not everyone had to sit through several credit hours’ worth of copy editing class – so just be patient.
  • Reporters, in particular, should consider expanding their social media brand by setting up a YouTube account. Those cats at Old Media, New Tricks have great how-to advice for branding yourself on YouTube. Yes, it can be more than just funny cat videos.
  • Take it from me, it’s tough to manage comments on your blog or news site, let alone learn to love them and use them to your advantage. I think a lot of the opinion in this piece is a bit pie-in-the-sky (because I’ve been there), but it offers good tips, nonetheless, for understanding online communities and managing commenters.
  • If you haven’t been using Twitter lists yet, here’s Mashable’s primer on what they are and how they work.
  • This is more for organizations rather than individuals, but Social Media Today has tips for making employees into effective Social Media Ambassadors. Hint: It goes beyond just getting everyone on Twitter and calling it a day.
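
As promised in the lifestream item above, here’s a bare-bones sketch of the underlying idea: pull updates from several sources and merge them into one chronological stream. (The sample data and field names are invented for the example; tools like Tumblr or Posterous handle the aggregation for you.)

```python
# Merge updates from several people/platforms into one chronological "stream".
# All entries here are invented; real lifestream tools pull these in automatically.
from datetime import datetime

tweets = [
    {"time": datetime(2010, 11, 2, 19, 5), "who": "@reporter1", "text": "Polls just closed."},
]
blog_posts = [
    {"time": datetime(2010, 11, 2, 18, 30), "who": "City Hall blog", "text": "Live results thread is up."},
]
photos = [
    {"time": datetime(2010, 11, 2, 19, 40), "who": "photog", "text": "Photo: line out the door at Precinct 12."},
]

def build_stream(*sources):
    """Flatten every source and sort the combined updates by timestamp."""
    combined = [item for source in sources for item in source]
    return sorted(combined, key=lambda item: item["time"])

for item in build_stream(tweets, blog_posts, photos):
    print(f"{item['time']:%I:%M %p}  {item['who']}: {item['text']}")
```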

No winners in online comment debate

From the time newspapers took a cue from blogs and added comments to online stories, we’ve been embroiled in debate.

I’ve been in the thick of it at Cincinnati.Com, where a big part of my job is monitoring our site’s comments, maintaining our moderation policy and fielding lots of angry correspondence from staff and readers regarding those comments.

Everyone wants to debate whether the comments have any value (they do, to those who comment) and who’s responsible for their content (the law says the commenter is responsible, not the institution, but that doesn’t stop people from insisting otherwise). They question whether removing comments stifles discussion on a topic – and whether that’s such a bad thing.

Online comments are a gigantic albatross for our sites, but I believe we need them. The racist remarks, predictable political attacks and name-calling on our stories could fill a book, but they’re worth it for the ones that really reflect a community’s mindset.

Newspapers are supposed to be a community hub – and we can’t fill that role without giving our readers a way to respond to the news the same way they do other places online.

At Cincinnati.Com, we do after-the-fact moderation (where users can report comments) on our 10,000+ comments each week in accordance with discussion guidelines we set up in May 2008. It’s an imperfect system – stuff stays on the site that shouldn’t because it isn’t found or reported – but people get to have their say.
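
(Mechanically, after-the-fact moderation boils down to a report queue: comments publish immediately, and enough reader reports push one in front of a human. Here’s a simplified sketch; the threshold and field names are my own assumptions, not Cincinnati.Com’s actual setup.)

```python
# Simplified after-the-fact moderation: comments go live immediately,
# and reader reports queue them for human review. Details are illustrative.
from collections import defaultdict

REVIEW_THRESHOLD = 3  # reports needed before a comment surfaces for review

report_counts = defaultdict(int)
review_queue = []

def report(comment_id, reason):
    """Record a reader report; queue the comment once it crosses the threshold."""
    report_counts[comment_id] += 1
    if report_counts[comment_id] == REVIEW_THRESHOLD:
        review_queue.append({"id": comment_id, "latest_reason": reason})

# Three readers flag the same comment; only then does it reach a moderator.
for _ in range(3):
    report("comment-8675309", "hate speech")
print(review_queue)  # [{'id': 'comment-8675309', 'latest_reason': 'hate speech'}]
```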

When the comments really start flying off-base, sometimes we have to make the choice to remove them altogether. WaPo Ombudsman Andrew Alexander recently wrote about that paper’s struggle with such a choice. They use the same system as the Enquirer, and it failed them when a subject of their story was vilified by his family in the comments….
