OK, our future is going to include citizen journalists in our communities contributing side by side with professional reporters, but sometimes well-meaning folks get things wrong or report unverified information. How do we separate rumor and myth from reliable, relevant content?
(Don’t get your knickers in a knot. I’m not suggesting the pros always get it right. We are paid to make the effort, however.)
Dan Schultz, writing Sunday over at MediaShift Idea Lab, has a plan. He envisions a five-step process that pairs the critical thinking of the community collective with computer smarts. It looks like this:
“Technique 1: Purgatory – New articles of any type will start in a section of the site dedicated to unchecked information. This content will not be ‘elevated’ to the mainstream area until it has been collectively rated and categorized, and meets a certain quality threshold. By placing content here the users’ critical abilities will be explicitly triggered, they will be reading the content specifically to judge it.
“Technique 2: Context – The system’s tagging process will make it possible to display potentially related articles for curious readers. During article purgatory this will help inform critical ability; a lone report about a huge explosion in Montana might not be credible, but seeing that there are 500 of them alongside links to a breaking story from the AP would make the piece much more believable.
“Technique 3: User history – Has the user contributed anything in the past? What is the average quality of those contributions? Has the user tended to write opinion or report pieces? The system can provide this information to readers, once again in the name of empowering critical ability.
“Technique 4: Intelligent systems – Spam is automatically caught by mail and forum filters all the time. Although our situation will still require human input, the system could flag particularly suspicious-looking or particularly good-looking content in order to help guide purgatory readers.
“Technique 5: Targeted moderation – Since people will define topical and geographic interests, new articles can be targeted during the moderation process. This would mean that Philadelphians would have higher clout when judging a story that is relevant to Philadelphia and that those who like nanotechnology would be more trusted to review the latest report on the nano-bot 5000.”
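For the programmatically inclined, the five techniques can be sketched, very loosely, as a small moderation pipeline. To be clear: this is not Schultz's implementation. The class names, the weighting formula, and the 0.7 quality threshold are all hypothetical choices made up for illustration. It shows purgatory (Technique 1), user history feeding a credibility score (Technique 3), and interest-matched clout (Technique 5); context display and automated flagging would layer on top.

```python
from dataclasses import dataclass, field

@dataclass
class Reviewer:
    name: str
    interests: set                                   # topical/geographic interests (Technique 5)
    past_scores: list = field(default_factory=list)  # quality of past contributions (Technique 3)

    def credibility(self) -> float:
        # Average quality of past work; brand-new users start at a neutral 0.5.
        return sum(self.past_scores) / len(self.past_scores) if self.past_scores else 0.5

@dataclass
class Article:
    title: str
    tags: set
    ratings: list = field(default_factory=list)  # (weight, score) pairs from reviewers

def rate(article: Article, reviewer: Reviewer, score: float) -> None:
    """Technique 5: reviewers whose interests match the article's tags get more clout."""
    overlap = len(article.tags & reviewer.interests)
    weight = reviewer.credibility() * (1.0 + overlap)
    article.ratings.append((weight, score))

def promote(purgatory: list, mainstream: list,
            threshold: float = 0.7, min_ratings: int = 3) -> None:
    """Technique 1: elevate an article only after enough weighted ratings clear the bar."""
    for article in list(purgatory):
        if len(article.ratings) < min_ratings:
            continue  # not yet collectively rated
        total_weight = sum(w for w, _ in article.ratings)
        weighted_avg = sum(w * s for w, s in article.ratings) / total_weight
        if weighted_avg >= threshold:
            purgatory.remove(article)
            mainstream.append(article)
```

In this toy version, a Philadelphian rating a Philadelphia story simply carries more weight than an outsider; a real system would obviously need spam defenses and smarter score aggregation.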
My first reaction was, wow, how time-consuming. Not necessarily, Schultz argues. “This probably all sounds like a lot to ask of Joe User, but it actually isn’t so bad. It will just involve spending a minute or two reading an amusingly bad or refreshingly good article about a topic that is likely to have been targeted (i.e., of interest) to them.”
A wiki-type interface for the ninja communities has been one option from the beginning. I believe Schultz’s system just might work with a wiki. What say you?
Oh, and lest anyone think it’s all doom and gloom in the journalism biz these days, Paul Bradshaw and his Online Journalism Team have started JollyJournalist.com, which celebrates all the reasons it’s a great time to be a journalist.
Bradshaw and team even posted their Top 10 list of reasons on the blog. You can add your reasons at JollyJournalist. Go on. Spread the enthusiasm.