Dispatches from the living amongst journalism's walking dead


TBD experiments in community engagement: Week 1

It’s the end of our first week in business at TBD and, admittedly, I’m completely exhausted. We all are.

It felt like a good first week for us – we got a lot of reviews, positive and negative, from other media sites and blogs. Despite the bugs and occasional complaints, we did have the opportunity to come out of the gates with a few engagement experiments you might find helpful at your own news orgs.

Open discussion on launch day

We had an open Cover it Live chat on the Community Blog from 9 a.m. to 4 p.m. on launch day. TBD Community hosts Lisa Rowan, Jeff Sonderman, Daniel Victor and Nathasha Lim took questions, complaints and bug reports from site visitors in an open and honest fashion. They didn’t just address the positive; they also did what they could to assuage the fears of those missing the former websites for WJLA and News Channel 8, now replaced by TBD.com.

Crowdsourcing for breaking news photos

On Thursday, the Washington, D.C. area woke up to severe thunderstorms, high winds, flooding streets – and a lot of damage. While our one full-time photographer was able to get a lot of art, we knew we couldn’t be everywhere. The call was sounded for photos on Twitter and on the site – and readers responded with submissions on-site and via Twitpic.

We ended up repeating this process later in the day with a reported electrical fire near the District’s business center. I first saw reports and Twitpics of the fire on a random Twitter search for “Fire near: Washington DC”. We quickly reached out on Twitter for permission to use the photos – and we were off to the races. It was great to get such a good response out of the gate.

Working with bloggers on breaking news

Around 1:30 pm Tuesday, I looked over one of my series of Twitter searches and found a tweet reporting an alleged hit-and-run by a Metrobus in Arlington, Va. I contacted the guy, Matt, via reply and asked him if he’d talk to our Arlington reporter, Rebecca Cooper. He agreed.

At 2:12, network partner site Unsuck DC Metro, to which the original tweet was directed, had a post up with the tip.

Another partner site, ARLNow, had a story with photos and quotes from the man involved in the accident at 3:07. TBD had a story with the tipster’s report and ARLNow’s report up before 4 p.m., approximately four hours before The Washington Post or WTOP (and a hat tip to the Post for promoting the great efforts of ARLNow).

Without the tip provided by Twitter and the hustle by the bloggers in our community network, there’s no way we could have had such a story so fast. Who says bloggers aren’t journalists? Not us.

Tapping into the crowd for political coverage

Questions submitted via Twitter hashtag


On Wednesday, TBD TV’s Newstalk program had the Democratic candidates for D.C. mayor on the program for a debate. In the hours before the 10 a.m. debate, we asked readers to submit their questions for the candidates via hashtag on Twitter. The response was more than we could fit on the program, which was great (see right).

When the debate went live on TV and online, fact-checking reporter Kevin Robillard had a live Cover it Live chat where readers could chime in with comments, ask questions and suggest facts to be checked as the candidates said them on the air.

The debate got a lot of traction on Twitter and on the chat. Kevin had some great material for The Facts Machine, which is a TBD blog dedicated to backing up or refuting questionable facts.

We hope to do a lot more projects like this in the future. Not bad for the third day out.

Anonymity isn’t to blame for bad site comments, it’s a lack of staff interaction

A Twitter discussion I glimpsed Sunday – and a follow-up blog post and discussion about it from Steve Buttry – had me thinking a lot yesterday about anonymous commenting on news sites. Of course, a lot of that also comes from the fact that I returned from a week-long furlough to moderate comments on the morning after the health care reform bill passed (I don’t know what the mood is like where you are, dear reader, but it’s pretty heated here in Southwest Ohio).

As I’ve written here before, it is part of my job to navigate the waters of Cincinnati.Com’s article and blog comments to determine what should stay and what gets removed as per our terms of service. Back in 2008, I helped set up the site’s comment system, wrote our discussion guidelines and laid the groundwork for how comments would be moderated. The process has evolved and grown to keep up with what we’ve learned from interacting with and watching our community members – and it’s given me a unique perspective on anonymity and commenting.

Of all the comments I’ve removed and all the users I’ve had to block from our sites, I’ve learned a few things that have led me to believe that anonymity doesn’t really matter at all. Here’s why:

1. Most users who have had comments removed do not believe their comment was racist/homophobic/libelous/spam – and they would see no problem posting that comment again (and again) under their real names.

2. Most users who have comments removed or are kicked off the site have no problem contacting staff by phone or email to complain, thus dropping their anonymity in most cases. Aside: The best is when they use a work email address to defend their statements about how “X race is too lazy to work”. Hilarity.

3. Banned or unverified users will find a way to post what they want to post. Whether it is creating a fake Facebook/OpenID identity, a new IP address, dozens of Hotmail addresses or cleared cookies – they’ll do it to get around a login system. There are about five users I have kicked off our site dozens of times – and there’s seemingly nothing I can do to get them to go away permanently. One even went so far as to tell me, “Do what you want. I have nothing but time on my hands – and you don’t.”

On the flip side, I am a longtime member of a message board that has very few of these problems. The site’s thousands of users know and respect one another for the most part, conversations stay on-topic and free of hate speech and I rarely see users or comments removed. What’s their secret? Constant moderator interaction.

A moderator is always online – and there is an indication of this that shows up on the forum. The moderator regularly participates in discussion, responds to questions and, most importantly, will give warnings publicly when they are needed. It’s not uncommon to see a gentle “Hey guys, let’s try to get this back on topic” or “I had to remove a few posts that got pretty heated – try to keep it civil, folks.” Sometimes the moderators don’t even have to do this. Other members will band together to fight off a troll – or defend a friend they feel was wronged. This sense of community derives from the understanding that there’s safety and support supplied by that moderator presence.

Contrast this with the moderator involvement on most news sites. Most users don’t even know a staffer was reading their comments until they are removed. Chances are most users don’t know a site’s moderators until they get a warning. We all know what the solution is, but our paper – and most other sites like ours – is not able to put that amount of manpower into moderation. Community interaction is not a top-level priority for most news outlets – and that’s the real problem.

We as an industry like to collectively wring our hands about the toxicity of online comment boards, but if we really want to improve the quality of on-site discussion we need to be willing to get involved in our sites in a hands-on manner. No amount of word filters, comment-detecting robots and user-end moderation will replace the presence of a dutiful moderator (and that, unfortunately, requires money).

NYT giving lessons in ineffective revenue models?

Last week, I and pretty much every other media blogger on earth wrote about the potential problems facing the New York Times’ plan to charge non-subscribers for using their site. Giving a bit of credit where it is due, the Times has evolved its metered paywall plan to not charge those coming into stories from blog referrals, emails and social media (which had been a big concern of mine).

While this change is great in that it recognizes the importance of the passer-by reader, it does present a challenge in the sense that most online readers fall into this category – so what kind of money can they get from charging for this content in the first place? As others have noted, it isn’t even as if they’re charging for content now, just for the ability to use their site navigation. In other words, they want to kill their section front traffic, but keep their story-by-story page views.

The Times’ Opinionator blog even grudgingly admits this seems like a bit of a back-off. No surprise, of course, that a NYT writer thinks the metered paywall is a good idea, but he realizes that online readers do not simply navigate to a newspaper site to peruse the news; they get their news from a combination of search, aggregators (including their own RSS readers) and recommendations from friends. If this trend continues and these sorts of readers increase in number (which they will, as this is the preferred newsreading method of my generation and those younger), this porous paywall thingie doesn’t look much like a revenue model at all. It’s half-assed at best.

Which raises the real question: Did the Times even really think this out? They made all kinds of big news when they first announced the metered paywall last week, to all kinds of old-school-media backpats, but then they immediately started backpedaling.

It’s made me wonder if they really had a firm grasp of what they sought to accomplish – audience and revenue-wise, with this plan from the get-go. I have to wonder, how much more will it change before it is implemented? And why did they announce this plan when they don’t seem to be very cognizant of what it will be or what they want out of it?

Jay Rosen hosts something of a debate about all of this on his blog. I suggest a read through the comments for a good look at what the reaction’s been to all of this re-jiggering.

Journalism and the Interwebs: A Reading Guide

I read a lot of industry blogs and they generally all boil down to two topics: complaining about the Internet (or complaining about people complaining about the Internet) and lamenting the future of news.  It makes it all a little tough to keep up with what actual issues we’ve settled this year and what’s still out there to be figured out.

Thankfully, the Nieman Lab Blog took the time to assemble what dominated discussion regarding the future of news this year and takes a look at what will likely be hot topics next year as the industry continues to reel and (hopefully) evolve. Most notably, next year seems to be heading in the direction of looking beyond the industry itself to what effects the changes in the industry will (or should) have on journalism education, politics and public policy.

And in the second camp of journalism industry blog posts, Paul Bradshaw reviews all of the complaints news folks have had against The Internets over the years in one fell swoop. From hating on Google to opposing blogs and user-provided news, he offers something of a summation of just how depressing some news execs can be when it comes to that which they don’t understand.

Getting to know our friends in the blogosphere

Have you read the 2009 Technorati State of the Blogosphere Report yet? It’s got some great demographics about bloggers that online news orgs would do well to know, as a lot of bloggers are voracious news consumers.

The report was compiled based on a survey of 2,828 bloggers, blog provider statistics and interviews with many key bloggers.

Fun facts from the study:

  • Bloggers are generally more affluent than the average person.
  • The blogosphere continues to be dominated by male, affluent and educated bloggers.
  • Most bloggers are “hobbyists” driven by personal fulfillment rather than financial gain.
  • Contrary to popular belief, many bloggers have professional media experience: 35% of respondents have worked in traditional media as a writer, reporter, producer or on-air personality, and 27% continue to do so.
  • While bloggers read other blogs, they do not consider them a substitute for other news sources, and the majority do not consider online media more important than traditional media.
  • 31% don’t think newspapers will survive the next ten years.

No winners in online comment debate

From the time newspapers took a cue from blogs and added comments to online stories, we’ve been embroiled in debate.

I’ve been in the thick of it at Cincinnati.Com, where a big part of my job is in monitoring our site’s comments, maintaining our moderation policy and fielding lots of angry correspondence from staff and readers regarding those comments.

Everyone wants to debate whether the comments have any value (they do, to those who comment) and who’s responsible for their content (the law says the commenter is responsible, not the institution, but that doesn’t stop people from insisting otherwise). They question whether removing comments stifles discussion on a topic – and whether that’s such a bad thing.

Online comments are a gigantic albatross for our sites, but I believe we need them. The racist remarks, predictable political attacks and name-calling on our stories could fill a book – but comments are worth it for the ones that really reflect a community’s mindset.

Newspapers are supposed to be a community hub – and we can’t fill that role without giving our readers a way to respond to the news the same way they do other places online.

At Cincinnati.Com, we do after-the-fact moderation (where users can report comments) on our 10,000+ comments each week in accordance with discussion guidelines we set up in May 2008. It’s an imperfect system – stuff stays on the site that shouldn’t because it isn’t found or reported – but people get to have their say.
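The after-the-fact flow described above – publish first, let readers report, then have staff review against the discussion guidelines – can be sketched in a few lines. This is a hypothetical illustration only; the threshold, names and structure are my own invention, not Cincinnati.Com’s actual system:

```python
from dataclasses import dataclass

REPORT_THRESHOLD = 3  # assumed value; a real site would tune this


@dataclass
class Comment:
    id: int
    body: str
    reports: int = 0
    visible: bool = True  # comments publish immediately


class ModerationQueue:
    """After-the-fact moderation: reader reports route comments
    to staff for review instead of blocking them up front."""

    def __init__(self) -> None:
        self.pending: list[Comment] = []

    def report(self, comment: Comment) -> None:
        comment.reports += 1
        # Queue for staff review once enough readers flag it.
        if comment.reports >= REPORT_THRESHOLD and comment not in self.pending:
            self.pending.append(comment)

    def review(self, comment: Comment, violates_guidelines: bool) -> None:
        # Staff decision: remove only if it breaks the guidelines.
        if violates_guidelines:
            comment.visible = False
        self.pending.remove(comment)
```

The tradeoff is exactly the one described: everything stays visible until someone notices and reports it, so violations that nobody flags never reach the queue.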

When the comments really start flying off-base, sometimes we have to make the choice to remove them altogether. WaPo Ombudsman Andrew Alexander recently wrote about that paper’s struggle with such a choice. They use the same system as the Enquirer, and it failed on them when a subject of their story was vilified by his family in the comments….

Do we miss the point of “hyperlocal”?

I think every medium and metro-sized newspaper has had this conversation in the past few years:

Editor #1: People aren’t going to our website to read state and national stories. It’s all the fault of that darn CNN and such.

Editor #2: Well, maybe so, but we’ve still got Community X.  They don’t do news there.

Editor #1: Maybe we’ll build a whole website just based on news from Community X! It’ll be awesome! Yeah, we’ll get, what do they call it?

Editor #2: Hyperlocal.

Editor #1: Right.

And so the hyperlocal news sites were born across the country. Some featured original reporting by staff, others were built on the work of citizen journalists. Some have already failed as others have taken on a life of their own.

When the Washington Post – the giant of the newspaper web world – decided to create a “hyperlocal” site based on Loudoun County, Va., it was a big deal. Of course, their idea of hyperlocal was a group of loosely connected communities instead of the communities themselves – but they’re the WaPo; if they want to call it hyperlocal, they can. Two years later, the WaPo announced the closure of LoudounExtra. Sure, the Post says, they’ll still COVER the area, but it won’t have its own website anymore.

About a year ago, the Wall Street Journal saw this coming, charging that the WaPo didn’t understand what it meant to be hyperlocal in the first place. I’m inclined to agree. What I see from a lot of big news outlets is a page collecting their stories on the area and little more – that isn’t hyperlocal coverage – it’s a hyperlocal aggregate feed.

What makes a good hyperlocal site isn’t just collecting a bunch of stuff about that area and throwing it up on a web page – it’s about understanding the community on a ground level. It helps to live there, but merely getting out there and getting to know people is a start. From what the WSJ post said, the staff at LoudounExtra wasn’t very invested in the area:

To penetrate those communities requires a more dedicated effort than the LoudounExtra.com team was putting forth. [The manager of the project] acknowledged he spent too much time talking to other newspaper publishers about the hyperlocal strategy and too little time introducing his team and the site to Loudoun County.

Whether that is ultimately why the site didn’t get enough traction to remain independent is a leap I won’t take – but it certainly would make sense. The WaPo, while it does serve a local audience in addition to its wide national base, may not be the experts at knowing what’s going on in Middleburg, Va. Who does? People on the ground in Middleburg, that’s who.

The best local-local writers are invested at a micro level. Take Mission Local, a neighborhood news site created through a hyperlocal news project out of UC Berkeley’s Graduate School of Journalism. The site has news important to those living in the area – stories of all sorts, a police blotter, maps. If you check out their About page, you see that the publication is based in the Mission District and many of the writers are residents there.

Another great example, the West Seattle Blog, is a husband-and-wife team focused on a very specific part of the larger city in which they live. I had the opportunity to meet them and hear about their operation when I was a fellow at the Knight Digital Media Center in March. They both have backgrounds in journalism and took that expertise to cover their own neighborhood. As a result, they regularly publish what’s going on before their local metro does.

Their crime page keeps a running tally from scanners and crime reports from residents. They have community-level announcements that come in from submissions. In addition to their own writing and reporting, they also have a selection of news and opinion from other bloggers in their area. All in all, they have a lot of content – all local (or hyperlocal!).

Even if there isn’t a person physically on the ground in the neighborhood, it takes knowing what people want to see from their area and how specific they want it to be. “Drilled-down” news can be done at a larger level – and it has value, if this week’s purchase of “microlocal” network EveryBlock by MSNBC is any indication.

As Paid Content said about the sale, EveryBlock had more value than LoudounExtra simply because of its focus on microcosms of communities – not just clumping a whole county together and calling it a community. The Dupont Circle page on EveryBlock is a great example. It has crime report maps, police calls, blog posts and more from a very specific area – pretty useful stuff if you live there – and most of it available from public information.

So the moral of the story is – don’t judge the future of “hyperlocal” news from the WaPo’s failed experiment. There’s gold in them there hills – but we have to actually work at making it accessible and useful.

* Eds Note: For the sake of disclosure, my current paper has a couple incarnations of these products. Cincinnati.Com has more than 100 community-level aggregate sites, including a few with their own discussion forums (and all featuring some pretty nifty maps if you ever want to check them out).

WaPo v. Gawker: Battle in the Blogs

This week, for some reason, Gawker is suddenly Public Enemy #1 to the online media world. It seems to be because they’re doing pretty well when it comes to online revenue and they do it largely by blogging about the news researched by other sources.

The reason it’s suddenly a big deal is that a writer at the Washington Post, Ian Shapira, finally decided to throw a (well-written) snit about Gawker blogging about one of their pieces. Shapira charges that Gawker infringed on the copyright of his work because so much of their post was derived from his story.

Gawker’s post quoted heavily from the source’s quotes in the Post story – in fact, slightly more than half of their very short post was from the WaPo story. The Nieman Journalism Lab took a look at what was used and asked its readers if they thought Gawker violated Fair Use or fell well within its guidelines. The comments are well worth a full read, as they really put the heart of the debate right out there:

1. The Gawker post clearly qualifies as Fair Use. Commenter Justin reminds us that the code states that content use “for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright.” Comment and criticism – what else is Gawker if not that?

2. Despite Shapira’s claims to the contrary, the Post did get credit. Sure, Gawker could have said it came from the Post before the end – but they gave them something far more valuable. They linked to the original story – several times, in fact. As commenter (and excellent young blogger) Cody Brown says, in the online world, that’s the best credit you can get.

3. Was the Post damaged by it? Hardly. Shapira noted that Gawker was the #2 referrer on the web to his story and likely contributed quite a few new readers to an otherwise mundane story that may not have had a lot of legs online otherwise.

4. Who owns the quotes from the source anyway? If Gawker should cut the post a check for quoting their piece and selling ads around it (which the WaPo writer suggests in jest), what does the Post owe their original source for selling ads around her quotes? (And furthermore, does reporting count as aggregation, too?)

5. Would the Post be complaining if it wasn’t Gawker? That’s debatable. As the commenter notes (and I say all of the time) other newspapers, broadcast and wire services do this quite a bit too – why isn’t there any more outrage about that?

I really question why Shapira’s editor even let him write that follow-up charging that Gawker stole from his work. Does Shapira really have a background that makes him knowledgeable enough about these sticky issues of fair use and media law that he can make claims that even experienced media lawyers aren’t altogether clear on? Also, how many of the Post’s online readers even care about this issue? You know who cares to hear about how much work Shapira put into this everyday story only to have it “ripped off” by big, bad blogs? Journalists. That’s about it.

How much of this whole debate – not just WaPo v. Gawker, but the whole blogs/aggregators vs. old media fight – is based in old-fashioned jealousy? Chris Krewson, editor of the Philadelphia Inquirer, said this to me on Twitter: “Aren’t we at least a little annoyed that Gawker and the aggies are faring well, ad-wise?”

Yes, I think we are. Gawker’s media sales have shot up this year. Ad revenues are up 45% year-over-year for the first six months of 2009 – and their production costs fall way below that of a newspaper. But isn’t that just good competition?

Maybe we just need to be better.

Here are more related posts about the whole Gawker debate you may find interesting:

  • Journalism’s Problem Isn’t Gawker. It’s Advertising. – The Atlantic Politics Channel – Atlantic’s followup analysis to the Nieman Lab post. Gawker isn’t the issue here, they insist, online advertising is the real issue – so maybe all of these people wringing their hands about Gawker and the like should focus on the task at hand. (amen)
  • Gawker’s Link Etiquette (or Lack Thereof) : CJR – An interesting look at Gawker’s linking habits. As the CJR notes, what they do falls within existing Fair Use guidelines and they DO link to the original piece – just way, way down in the story. I don’t agree with the practice, but I also don’t think we need a law that makes Gawker link to the original higher in the story.

Really, Plain Dealer?!?

First of all, it should be stated that I’m a big fan of The Cleveland Plain Dealer and Connie Schultz, who is a Pulitzer Prize winner and fellow Kent State alum. That said, they are completely out of their minds. Today, they give yet another gigantic middle finger to the entire Internet in a “story” that reads a bit more like a very smug blog post promoting their misguided efforts to stop the interwebs from doing what interwebs do.

Some backstory, if you don’t know it:

What started as a plan to get a lawyer’s name in the news became an incredibly uninformed column by Schultz and eventually evolved into an embarrassing sideshow that has pulled a newspaper into an effort to limit the First Amendment rights of bloggers and to ask other sites not to give them web traffic. Oh, and it also calls aggregators, RSS readers and bloggers “parasites”. Nice work, guys (facepalm).

This “plan” to change U.S. copyright law, put forth by David and Daniel Marburger (brothers and a lawyer and economist, respectively), seeks to ban aggregators and bloggers from linking or paraphrasing news content within the first 24 hours of its creation.

TechDirt has an excellent analysis of all of the things that are wrong with this half-baked plan – not least that it conveniently ignores the significant traffic the PD’s own site gets from aggregators every day. I can speak with some knowledge on that fact: Cleveland’s website regularly features links to our stories, which show up as popular referrers in our traffic reports (and we love them for it).

As Jeff Sonderman also points out, the PD would be outraged if they themselves were held to this standard. We all would:

How would the Cleveland P-D like it if their new copyright law prohibited them for 24 hours from reporting plane and train crashes, celebrity deaths, political scandals, or anything else that Twitter, TMZ, Talking Points Memo or the Drudge Report had first?

Schultz, for her part, really misrepresents aggregation in the first place. She says these “parasitic” aggregators “reprint or rewrite newspaper stories, making the originator redundant and drawing ad revenue away from newspapers at rates the publishers can’t match.”

Actually, a true aggregator would have a headline from the originating site with a description of the story – usually auto-generated by the original site – and a link back to the original story. You know, PD, if you don’t want your stories to go out to aggregators, maybe you shouldn’t make RSS feeds available for them in the first place. Just a thought.
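What a true aggregator consumes is exactly what the publisher’s own RSS feed exposes: headline, blurb and a link back to the original story. A minimal sketch of that mechanic, using a made-up feed (the element names come from the RSS 2.0 format; nothing here is any real aggregator’s code):

```python
import xml.etree.ElementTree as ET

# A toy feed standing in for a publisher's auto-generated RSS.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Paper</title>
  <item>
    <title>Council approves budget</title>
    <link>https://example.com/news/budget</link>
    <description>The city council approved the 2010 budget.</description>
  </item>
</channel></rss>"""


def aggregate(feed_xml: str) -> list[dict]:
    """Pull the headline, blurb and link back to the original story --
    exactly what the feed's publisher chose to syndicate."""
    root = ET.fromstring(feed_xml)
    return [
        {
            "headline": item.findtext("title"),
            "blurb": item.findtext("description"),
            # The link points at the source site, so clicks (and page
            # views) flow back to the publisher, not the aggregator.
            "link": item.findtext("link"),
        }
        for item in root.iter("item")
    ]
```

The point the sketch makes: every field an aggregator shows, the publisher put in the feed voluntarily, and the link sends the reader home.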

The Marburger plan, at least, somewhat seems to understand the term, but it still has the wrong bad guy. Its focus is not actually on true aggregators but on bloggers and other competitors in the market who don’t have a reporter on the scene for the original report and instead write an analysis or report based on what the original source published. This is commonplace – and I can state as a matter of fact that it is done by “professional” news outlets every day. Not to mention it is a pretty standard practice of the AP, which is featured on the PD’s news pages. Et tu, Brute? (That’s sarcasm, kids.)

Depressing, isn’t it? While I agree that copyright law needs to be updated for the digital age, this isn’t what I had in mind.

In a roundabout way, this all continues to prove my point about newspapers pointing fingers at the wrong bad guys. After all – they too have links to Digg and other social sharing sites on their stories and blogs. Funny the way it is….

Recommended reading: Content, traffic and pay walls

It’s time to cut off support for Digg

Digg.com has been into more shenanigans – prompting this content provider to ask: Have they gone too far? And if so – why do we in online media continue to support them?

On Monday, Mashable confirmed that Digg surreptitiously changed the behavior of its short URLs in a fashion that diverts web traffic intended for content publishers’ sites to Digg.com.

The move has the social web in an uproar – and should have media websites shaking in their boots. It seems the social media site many of us in online news have taken to caring for and feeding with the content that makes it so popular has turned on us in a big way.

Digg URLs are (or were) very popular with users of Twitter and other microblog services wishing to share links. Then, without alerting its users, Digg made it so those shortened external links no longer go to that great blog entry or article you wanted to share – they link directly to Digg.com. Do not pass go, do not collect your page views. In short, the Digg URLs are not shortened URLs at all, but rather a Digg-exclusive traffic driver.
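The complaint boils down to where the short code resolves. A toy model of the before-and-after behavior (all names and URLs here are hypothetical stand-ins, not Digg’s actual implementation):

```python
# A shortener's lookup table: short code -> the page the user shared.
SHORT_LINKS = {"abc12": "https://example-paper.com/story"}


def resolve_passthrough(code: str) -> str:
    """Original behavior: the short URL redirects straight to the
    publisher's page, so the publisher gets the page view."""
    return SHORT_LINKS[code]


def resolve_diverted(code: str) -> str:
    """Changed behavior: the same code now lands on the shortener's
    own permalink page, and the publisher's page view never happens."""
    return f"https://digg.example/story/{code}"
```

Same code, same tweet, different destination – which is why publishers only noticed after their referral traffic moved.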

Only tonight has Digg at least somewhat rolled back this change to restore previously used Digg URLs to their original destinations. Even so – they intend to go forward with the traffic diversion plan despite the outcry from users.

I suppose we in online news should have seen this coming. It wasn’t the first sign of aggression from Digg.  I’d say Digg has more than proven that it is a direct threat to content publishers – so why are we still supporting it? Oh, you didn’t know you were supporting Digg? Better take a look at your site.

Check out the articles, blogs, photos and any other content you create. Chances are, there is some method for sharing that content online with the likes of Facebook, Twitter, Delicious and, yes, Digg. Sometimes that is a button that says Digg, other times it may be a service like the ShareThis button you see on this blog.

See, at one time, Digg was a real boon for online publishers. If your story was popular on Digg, the influx of page views coming from its army of users could be staggering. We wanted everything to be on Digg. In fact, we made it as easy as possible to get our content listed on their site by making these links as prominent as possible.

But it turns out in doing all that reaching out – we contributed to the creation of the very bully who’s stealing publishers’ lunch money. Even though it might not make much of a difference,  we in the online news biz need to take back our tiny corners of the web and at least remove Digg from our pages.

Aside: I know I seem like a hypocrite calling for this, being that I haven’t figured out what to do with my own blog yet, but bear with me.

I’ll fight with online naysayers ’til the cows come home about aggregators and Google – but Digg is a credible threat. It’s time to let them go. Besides, if your site is like that of my employer – they are a drop in the bucket compared to Twitter and Facebook these days anyway. Good riddance.
