Dispatches from the living amongst journalism's walking dead

Tag: news

Twitter is the perfect place to break news (but don’t tell Reuters)

When Reuters released its new social media policy last week, its competition had to be salivating. The wire service appears to be digging its own grave by stipulating in no uncertain terms that its reporters are not to use social media to break news. All news is to be broken on the Reuters wire, no exceptions.

Spurning social media for breaking news in order to protect your wire service is a bit like an early-’90s telephone company refusing to develop an Internet service and instead letting competitors use its lines to sell dial-up access to its customers.

Truth is, Twitter is the perfect medium for breaking news. I think of it as the latest incarnation of the “this just in!” radio bulletin. As a tool, it is immediate, mobile and searchable by keyword and location; you can easily see who has passed on your news (via retweets); link traffic is easily tracked; and, best of all, it has your brand attached, so you get credit for the scoop.

There is absolutely nothing more satisfying to this newshound than a series of re-tweets on my item from readers – and even better when it includes a begrudging re-tweet from my competitors.

If a news outlet that uses the Reuters wire is the first to post an item to social media, it will look as if that outlet broke the news. Its link to the same Reuters content will be the one passed around from retweet to retweet. One would think Reuters might want to get its own name on it first – but I guess not.

I see this play out every day on my TweetDeck, as the local TV stations battle to be first to tweet out the latest kooky AP news item from 200 miles away. I can’t help but think, “Gee, why isn’t the AP trying to get this into this market’s Twittersphere before local news outlets even get the chance?”

In the end, it won’t matter that Reuters broke the news on the wires first. Most readers don’t read the wires; they get their news from their preferred media site or from social media. As more and more news organizations use Twitter to break news (or, in the case of the BBC, mandate it), news providers who are late to the party on every story will eventually render themselves pretty useless as breaking news resources.

It’s downright shameful that an industry leader in breaking news – one that broke some of the biggest stories of the 20th century – would just let that go in favor of protecting a corner of the market in a way that benefits neither its readers nor its reporters.

I have to say, the rest of the policy is rather helpful. It largely focuses on explaining how journalists can manage professional and personal brands on Twitter, including guidelines for making corrections in the social media sphere and for taking a thorough look at one’s profiles to avoid accusations of bias. All good info to know.

Sunday plan evolves from print-only to print-first

I first wrote last week about my employer, The Cincinnati Enquirer, experimenting with a print-only strategy for certain stories to boost Sunday single-copy sales.

Not long afterward, I was in a meeting where we decided on the next course of this ever-evolving experiment – and came up with a conclusion web readers should find a bit more agreeable.

This past Sunday, the logo and the experiment changed from “Print Exclusive” to “Print First.” This week, the six selected Sunday stories were promoted on Cincinnati.Com and held from online publication until today. The intent was to give more value to the printed Sunday edition without keeping the stories permanently unavailable to online and out-of-market readers. It was a solution suggested by many of those who responded to my post last week (more on that later) and a compromise that proved very agreeable in our editor meetings on the subject.

While I don’t know how it worked for print sales, it seemed to work well for us on the online side at Cincinnati.Com. Mondays are notoriously slow for news with art, so these embargoed Sunday blowouts were there for us to use today in prominent spots – and a few of them (like this piece on Larry Flynt’s family lawsuit – as if that one isn’t primed for the web) are doing very well in terms of page views.

We’ve known for a while that our online readers and print readers are not usually the same people – not just here, but at newspaper sites everywhere. A strategy like this seems to reflect that as well, since the stories we held from online yesterday are today enjoying new life and a burst of traffic (not to mention placement in search engines and linkage from all over).

Simply put, we shouldn’t try to sell our web readers the print newspaper – if anything, we should try to sell them the news they want in the format they want it in. Newspapers can’t afford to devalue the web audience if they want to succeed in the long run, which is why everyone’s trying to find a way to make money online through paywalls, freemium content, micropayments and whatever else is coming down the pike.

While I’m personally not crazy about some of those plans, I think anything is better than entirely withholding the news from the web audience. Judging from the responses I got last week and what we discussed internally at the Enquirer, I’m not the only one.

Here are some of the responses to the “Print Exclusive” experiment that I received via email and social media:

– I purchase the paper every Sunday and truly enjoyed [last week’s] piece on homeless teens….  I was however disappointed when I could not find the article online, as I wanted to email it/tweet it. I see the point in having print-exclusives to drive paper sales, but I am wondering if it might not be possible to post the articles online once the print editions are no longer available?

– If the Enquirer sold the Sunday sports section as a standalone print product, I’d buy that, but that’s all I’d want. Mostly I’m a web reader.

– I can see not putting the content online before print, but don’t make it unavailable to me online. Even if I have to pay for it or buy a day pass to your e-edition, at least I have a way to read it if I want.

– You should be able to “buy” daily copies of the paper online in the e-edition. Maybe even just make the Sunday e-edition a subscription option. I’d buy it.

– This seems kinda bass ackwards to me. You should be increasing your online presence rather than reducing it. I think the proposed pay model for the New York Times is perfectly agreeable and I have no problem subscribing to that.

What about you? What do you think of this latest plan?

Weather coverage made easy

Weather is big business for those of us in news, especially when it turns extreme, as it has in just about every state over the last two weeks.

Lots of news outlets have developed amazing new ways to get out weather information and pull in interaction from readers, but sometimes what’s simple can work in a pinch.

When we’ve had snowstorms in the past, Cincinnati.Com has typically had a basic story file set up that we re-top and add to throughout the day as the news changes. Without the occasional total re-write during the news cycle, it can end up reading like a very long Frankenstein of an article, with specific items at risk of getting buried in all of the text.

I recently set up a basic WordPress blog specifically to handle weather news and avoid this problem. It has links to all of the basic weather info we have available on the site, a way to search all of the posted entries, and tags/categories that make posts easy to browse by topic or location. The blog uses the TDO Mini Forms plugin, which lets our reporters – and our readers – submit updates from wherever they are.

Even though we haven’t yet gotten a lot of reader submissions, the blog has been immensely helpful from a news management standpoint. Reporters can file to the blog from their homes, phones or satellite offices; all we have to do is click “publish” in our dashboard. No re-writes are necessary because, as the story develops, we can just add new posts. The format also provides an easy way to “sticky” important posts at the top and generates a handy link for the day’s event cancellations.

This easy method of publishing weather updates has been a great supplement to our info releases and content on Twitter, our mobile site, text alerts and all of the usual photos and videos we bring out for every story. The blog’s been doing great traffic on storm days and, from my view, has lifted a huge burden from the backs of already busy online editors (such as myself).
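If you’re wondering how those blog posts could get repurposed for other channels, here’s a rough sketch of the kind of script that might do it – pulling the blog’s RSS feed and trimming each post down to an alert-sized blurb. To be clear, this isn’t our actual production setup; the feed URL and the 140-character limit are placeholders for illustration.

```python
# Rough sketch: pull the latest posts from a WordPress weather blog's RSS feed
# and trim them into alert-sized blurbs (for Twitter, text alerts, the mobile site).
# The feed URL is a placeholder, not our actual address.
import feedparser

FEED_URL = "http://example.com/weatherblog/feed"  # hypothetical feed address


def latest_updates(limit=5, max_chars=140):
    """Return up to `limit` recent posts as short "headline + link" strings."""
    feed = feedparser.parse(FEED_URL)
    updates = []
    for entry in feed.entries[:limit]:
        title = entry.title.strip()
        link = entry.link
        # Leave room for the link plus a separating space.
        room = max(0, max_chars - (len(link) + 1))
        if len(title) > room:
            title = title[:max(0, room - 1)] + "…"
        updates.append(f"{title} {link}")
    return updates


if __name__ == "__main__":
    for update in latest_updates():
        print(update)
```

The nice thing about leaning on the feed is that anything a reporter (or reader) files through the blog becomes available to every other channel without anyone re-keying it.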

Because this info has such a short shelf life, I’ve just been deleting all of the old content as soon as the storm coverage ends. We don’t want readers coming back for new weather updates only to find outdated info from last week’s storm. I know that isn’t the greatest option for the sake of SEO and outside linking, but it has made it very easy to essentially launch whole new blogs for each circumstance. I’m curious to hear others’ thoughts on what they would do to prevent link breaks and confusion.

Anyway, that’s been our publishing plan these past two weeks – and if it’s something you think you could use, go for it. WordPress is free, quick to set up and has lots of plugins to enhance the user experience.

What have you been doing to cover these storms online? What have you been reading?

Newsday is paying for that paywall

New York Times cheerleaders and other fans of paywalls should take note of the plight of nearby Newsday.

Newsday went behind a paywall for non-subscribers three months ago. This week, the paper revealed that since then it has netted only 35 online-only subscribers. Ouch.

Newsday was banking on its local news coverage being so important to online readers that they’d eagerly pay to access it, even though there’s plenty of (free) competition in the NYC/NJ area. The redesign lets non-subscribers see article excerpts; reading whole stories costs $5 per week.
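Do the math on that: 35 online-only subscribers at $5 a week is about $175 a week – and even if every one of them had signed up on day one, that’s somewhere around $2,300 in online-only subscription revenue for the entire first three months.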

Because of the low adoption rate so far, web traffic to Newsday’s site has, predictably, plummeted. According to Nielsen Online figures, the site’s page views dropped 30% from October to December, meaning that whatever non-subscription revenue it earns from online advertising is taking a plunge too.

Newsday’s editors don’t seem to mind – they say it wasn’t about numbers and subscribers, but rather about protecting the brand from freeloaders and offering a “premium” product to loyal subscribers. While that’s noble and gutsy, it doesn’t create any new revenue to fund an online product. Food for thought, I suppose.

In online news, only the presentation matters

It seems the industry may finally be learning from our companions in social media and aggregation. We’re starting to see that users want things to be simple, up-to-the-minute, all in one place and, by God, they aren’t going to just read whatever we say they should.

I’ve been working on a project with Gannett that tackles the next phase of our websites’ design to reflect a lot of these observations, and I expect the same is happening at news companies all around the nation. I can only hope we don’t all keep making the same mistakes in designing around the often conflicting interests of content and advertising.

The past couple of weeks have seen the rollout of a few new looks and ideas for online news presentation that really seem to focus on the observed needs and desires of readers, while not ignoring how much the online medium has to offer. These three presentations, each in its own way, seem to fit what we know users want… and, quite notably, their designers dared to build them without ad positions.

NewsPulse

NewsPulse on CNN.com is a great visualization of an idea many of us have had for online presentation. It’s a simple, sortable stream of stories by media type, topic and various measures of popularity. It is essentially Digg without the Diggs, and with a much cleaner interface.

Caveat: As a front-page web news manager, I’d like to see some measure of news importance factored in as a filterable option, since many people who’d use this product might otherwise never see “important” headlines – those stories won’t be popular, or won’t fall in a topic area they tend to read. Of course, a user who wants “important” news should probably visit another site anyway (zing).
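To make that caveat a little more concrete, here’s a toy sketch of what a NewsPulse-style stream with an editor-set importance flag might look like. The stories, field names and filters below are invented for illustration – this isn’t how CNN’s product actually works under the hood.

```python
# Toy sketch of a NewsPulse-style stream: sortable by popularity, filterable by
# topic, with an editor-set "importance" flag as one more filter. All of the
# stories, field names and numbers here are invented for illustration.
from dataclasses import dataclass


@dataclass
class Story:
    headline: str
    topic: str
    media_type: str   # "text", "video", "gallery"
    page_views: int
    important: bool   # set by an editor, not by popularity


STREAM = [
    Story("Council passes city budget", "politics", "text", 1200, True),
    Story("Kitten rescued from tree", "offbeat", "video", 9800, False),
    Story("School levy fails again", "education", "text", 450, True),
]


def filter_stream(stories, topic=None, important_only=False):
    """Filter by topic and/or editorial importance, then sort by popularity."""
    kept = [s for s in stories
            if (topic is None or s.topic == topic)
            and (not important_only or s.important)]
    return sorted(kept, key=lambda s: s.page_views, reverse=True)


# The usual popularity sort...
print([s.headline for s in filter_stream(STREAM)])
# ...and the same stream with the editor-curated importance filter flipped on.
print([s.headline for s in filter_stream(STREAM, important_only=True)])
```

The point is simply that “important” can sit alongside popularity and topic as one more filter, rather than trusting popularity to surface it on its own.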

Living Stories

Living Stories, the new presentation experiment from Google, the NYT and the WaPo, is exactly what online news should be. I can’t get over how amazing this presentation is and how useful it can be for following a complex, long-term story or topic (like health care reform).

A Living Story gathers all of the news updates, opinion, multimedia and conversation on an ongoing story in one place, at one URL. The format is built to help a reader see the latest developments in a story, with a timeline of events, important documents and user comments in an easy-to-digest fashion. What I like best is that it’s customizable, cookied so returning visitors can pick up where they left off, and easy to follow offsite via RSS and email alerts.

Best of all, if this project works out for all parties involved, Google will make the format available to other sites. It’d be a huge improvement over what’s currently available on most news sites, including the WaPo’s and the Times’ own. You can read more about Living Stories from Paul Bradshaw, who is similarly dazzled.

Real-Time Search

You might not consider it a news presentation, but Google’s real-time search is a perfect format for breaking news. It builds on Google’s already formidable search presence with live news updates on a searched topic from news sites and Twitter (with more to come). It isn’t exactly made for news, but it should be. Maybe if we spent more time working with Google as opposed to trying to fight them, we could get something really great out of a product like this.

We at Cincinnati.Com used Google’s real-time search to supplement our coverage of University of Cincinnati football coach Brian Kelly’s departure for Notre Dame. It’s an improvement over Twitter search (which we’d usually use) in a lot of ways because it lets you see the latest news on the topic from blogs and news sites, too. I do wish that, like Twitter search, it allowed you to customize a geographic range… but that can always come later.
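For what it’s worth, the geographic-range part isn’t rocket science once updates carry coordinates – and that’s the big assumption. Here’s a bare-bones sketch of limiting geotagged updates to a radius around Cincinnati; the sample updates are made up, and getting reliable geotags on real content is the genuinely hard part.

```python
# Bare-bones sketch of limiting geotagged updates to a radius around a point.
# The big assumption: every update already carries a latitude/longitude.
# The sample updates below are made up for illustration.
from math import radians, sin, cos, asin, sqrt

CINCINNATI = (39.10, -84.51)  # approximate downtown coordinates


def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # Earth's radius is roughly 3,959 miles


def within_range(updates, center=CINCINNATI, radius_miles=25):
    """Keep only the updates that fall inside the given radius of the center."""
    lat0, lon0 = center
    return [u for u in updates
            if haversine_miles(lat0, lon0, u["lat"], u["lon"]) <= radius_miles]


updates = [
    {"text": "I-75 shut down at Mitchell Ave", "lat": 39.16, "lon": -84.50},
    {"text": "Snow squall moving through Columbus", "lat": 39.96, "lon": -83.00},
]
print(within_range(updates))  # only the Cincinnati-area update survives
```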

Confessional: Shameless page view ploys

Lest anyone think I’m casting stones without acknowledging my own sins, I decided to share a list of the shameless ploys I’ve used to get page views for my employers and blogs. What I’ve listed is hardly out of the ordinary for any website, but I still feel bad about it sometimes.

If I could go back to my journalism school days and share the following list with 2001 Mandy, she’d probably change majors. I won’t say when these stunts were done or who I worked for at the time – but they happened. I’ll repent for my sins someday.

Feel free to add your own or others you’ve seen in the comments.

Mandy’s Most Shameless Page View Ploys

  1. Built a photo gallery when a story would have better served the subject matter
  2. Changed the headline and summary to reflect something far more exciting/scandalous than the story’s subject.
  3. Published an online story that had only a paragraph of text and a link to a competitor’s story.
  4. Given premier position to outrageous crime stories even though news judgment did not warrant it.
  5. Published link bait from the AP and other services even though it was out of our coverage area.
  6. Submitted our own news content to Digg and Fark rather than waiting for others to submit it.
  7. Picked the sexiest girl out of a photo gallery to promote that gallery in a prominent news spot.
  8. Prominently featured crime stories/pet stories/disaster stories on the site long past their expiration date to keep getting page views.
  9. Linked together completely unrelated stories to draw views to unpopular content.
  10. Published content that was indistinguishable from advertising/press releases simply because it would get traffic.

More takes on web analytics for news

Aside from my past couple of rants about web analytics, here are a few other takes on the issue from bigger thinkers than me:

The Online Journalism Review takes a look at all of the web analytics measures out there to explain what is what – and what could possibly be the best measure of engagement. One they don’t discuss much is time on site, which I think is one of the truest measures of engagement on a piece-by-piece basis (there’s a rough sketch of what I mean after these links).

On the flip side, the Nieman Lab says that web analytics make us as an industry overstate the importance of the online audience compared to the print audience. I don’t really agree with the methodology, but it certainly makes a case for print getting more money from advertisers.

EConsultancy – a marketing blog, of all places – calls out some of the worst ways to drive page views in a page view-driven market. These include pagination, slideshows (Forbes, we’re looking at you) and self-linking.

Master New Media asks whether web designers should optimize sites for page views or for user experience. Of course, we’d love to tell you that you can have your cake and eat it too, but after doing a redesign on Cincinnati.Com last year, I’ve seen the beast – and it isn’t friendly to readers.
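And since I keep bringing up time on site: here’s the rough sketch I promised above of how you might approximate per-story time on page from an ordinary pageview log. It assumes each hit records a session ID, URL and timestamp, and it estimates time on a page as the gap until that session’s next hit – which means the last page of every session goes unmeasured, the usual blind spot of this metric. The log format and numbers are invented for illustration.

```python
# Rough sketch: approximate per-story "time on page" from a plain pageview log.
# Each hit is (session_id, url, timestamp_in_seconds). Time on a page is
# estimated as the gap until that session's next hit, so the last page of every
# session goes unmeasured. All of the sample data below is invented.
from collections import defaultdict


def average_time_on_page(hits):
    by_session = defaultdict(list)
    for session_id, url, ts in hits:
        by_session[session_id].append((ts, url))

    totals = defaultdict(float)  # url -> total seconds spent
    counts = defaultdict(int)    # url -> number of timed views
    for views in by_session.values():
        views.sort()  # chronological order within the session
        for (ts, url), (next_ts, _) in zip(views, views[1:]):
            totals[url] += next_ts - ts
            counts[url] += 1
    return {url: totals[url] / counts[url] for url in totals}


hits = [
    ("abc", "/news/storm-closings", 0),
    ("abc", "/news/flynt-lawsuit", 45),
    ("abc", "/sports", 400),
    ("xyz", "/news/flynt-lawsuit", 10),
    ("xyz", "/news/storm-closings", 310),
]
print(average_time_on_page(hits))
# {'/news/storm-closings': 45.0, '/news/flynt-lawsuit': 327.5}
```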

Do page views make us biased?

Aside from my little rant about page views yesterday, there are far more reasons to seek another way to engage online audiences, for the good of the overall product.

Eat Sleep Publish really lays out a great case against page view-driven news value. The author, Jason Preston, suggests page view goals create a conflict of interest for news managers. As a daily online news manager for a metro news site, I can see where he’s coming from.

He notes that the overall value of a story to many news organizations lies in how many page views it receives online. When everyone’s competing not to be the next one laid off, it’s only natural for a reporter to write in whatever way gets page views, or for an editor to place a story based on how many page views they think it might get (as opposed to its actual news value). The latter, I’ll admit, happens all of the time.

Does this put us, the newsroom types, in the employ of advertisers? Sure, we may not know exactly who they are – but is it a bias to push for them to make more off of ad impressions? Very intriguing food for thought. I’d be interested in hearing more opinions on this, people.

Daily news podcasts

I spearheaded the start of the Journal Sentinel‘s daily news podcast in 2006. It started with me or another member of the online staff writing, voicing and producing the podcast in Sound Edit Pro each night, but eventually I trained several copy editors to join a daily podcasting rotation.

Here are a couple of podcasts I voiced and produced myself during that time. Once I get the stupid Flash player to actually work on this site, I’ll have a pretty audio player to play them here.

Here’s one from February 6, 2007 and one from March 1, 2007.

Do we miss the point of “hyperlocal”?

I think every medium- and metro-sized newspaper has had this conversation in the past few years:

Editor #1: People aren’t going to our website to read state and national stories. It’s all the fault of that darn CNN and such.

Editor #2: Well, maybe so, but we’ve still got Community X.  They don’t do news there.

Editor #1: Maybe we’ll build a whole website just based on news from Community X! It’ll be awesome! Yeah, we’ll get, what do they call it?

Editor #2: Hyperlocal.

Editor #1: Right.

And so hyperlocal news sites were born across the country. Some featured original reporting by staff; others were built on the work of citizen journalists. Some have already failed, while others have taken on a life of their own.

When the Washington Post – the giant of the newspaper web world – decided to create a “hyperlocal” site based on Loudoun County, Va., it was a big deal. Of course, its idea of hyperlocal was a group of loosely connected communities instead of the communities themselves – but it’s the WaPo; if it wants to call that hyperlocal, it can. Two years later, the WaPo has announced the closure of LoudounExtra. Sure, the Post says, it will still COVER the area, but the area won’t have its own website anymore.

About a year ago, the Wall Street Journal saw this coming, charging that the WaPo didn’t understand what it meant to be hyperlocal in the first place. I’m inclined to agree. What I see from a lot of big news outlets is a page collecting their stories on an area and little more – that isn’t hyperlocal coverage; it’s a hyperlocal aggregate feed.

What makes a good hyperlocal site isn’t just collecting a bunch of stuff about an area and throwing it up on a web page – it’s understanding the community at ground level. It helps to live there, but merely getting out and getting to know people is a start. From what the WSJ piece said, the staff at LoudounExtra wasn’t very invested in the area:

To penetrate those communities requires a more dedicated effort than the LoudounExtra.com team was putting forth. [The manager of the project] acknowledged he spent too much time talking to other newspaper publishers about the hyperlocal strategy and too little time introducing his team and the site to Loudoun County.

Whether that is ultimately why the site didn’t get enough traction to remain independent is a leap I won’t take – but it certainly would make sense. The WaPo, while it does serve a local audience in addition to its wide national base, may not be the expert on what’s going on in Middleburg, Va. Who is? People on the ground in Middleburg, that’s who.

The best local-local writers are invested at a micro level. Take Mission Local, a neighborhood news site created through a hyperlocal news project out of UC Berkeley’s Graduate School of Journalism. The site has news that matters to those living in the area – stories of all sorts, a police blotter, maps. Check out the About page and you’ll see that the publication is based in the Mission District and that many of the writers are residents there.

Another great example, the West Seattle Blog, is run by a husband-and-wife team focused on a very specific part of the larger city they live in. I had the opportunity to meet them and hear about their operation when I was a fellow at the Knight Digital Media Center in March. They both have backgrounds in journalism and took that expertise to cover their own neighborhood. As a result, they regularly publish what’s going on before their local metro does.

Their crime page keeps a running tally from scanners and crime reports from residents. They have community-level announcements that come in from submissions. In addition to their own writing and reporting, they also have a selection of news and opinion from other bloggers in their area. All in all, they have a lot of content – all local (or hyperlocal!).

Even without a person physically on the ground in the neighborhood, it takes knowing what people want to see from their area and how specific they want it to be. “Drilled-down” news can be done at a larger scale – and it has value, if this week’s purchase of “microlocal” network EveryBlock by MSNBC is any indication.

As Paid Content said about the sale, EveryBlock had more value than LoudounExtra simply because of its focus on microcosms of communities – not just clumping a whole county together and calling it a community. The Dupont Circle page on EveryBlock is a great example: it has crime report maps, police calls, blog posts and more from a very specific area – pretty useful stuff if you live there – and most of it drawn from public information.

So the moral of the story is – don’t judge the future of “hyperlocal” news from the WaPo’s failed experiment. There’s gold in them there hills – but we have to actually work at making it accessible and useful.

* Ed’s note: For the sake of disclosure, my current paper has a couple of incarnations of these products. Cincinnati.Com has more than 100 community-level aggregate sites, including a few with their own discussion forums (and all featuring some pretty nifty maps, if you ever want to check them out).

Recommended reading for June 16th – 17th

These are my recommended links for June 15th through June 17th:

Recommended reading for April 24th through May 5th

These are my recommended links for April 24th through May 5th:

