In the post I highlighted several issues which are barriers for academics. One of the concerns was quality indicators. How does one quantify a blog's impact?
Walt Crawford's thoughtful approach to identifying the "reach" of librarianship-oriented blogs offers a few interesting ideas. Walt's study is a follow-up to his 2005 study. He comes up with several Top lists. Walt is very clear that his list is not the Top 50, but a Top 50 that was primed with his own Bloglines subscriptions. Working from over 554 liblogs, Walt "drained the pool" based on the number of subscriptions and the number of links found in Google and MSN Search.
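Walt's winnowing step can be pictured as a simple filter over the candidate pool. A minimal sketch, with made-up thresholds and field names (his actual cutoffs and data are not reported here):

```python
# Each candidate blog carries a Bloglines subscription count and the
# number of links found via Google and MSN Search. The blogs and
# thresholds below are hypothetical, for illustration only.
blogs = [
    {"name": "Blog A", "subscriptions": 40, "links": 120},
    {"name": "Blog B", "subscriptions": 3,  "links": 15},
    {"name": "Blog C", "subscriptions": 25, "links": 60},
]

def drain_pool(pool, min_subs=10, min_links=50):
    """Keep only blogs clearing both example thresholds."""
    return [b for b in pool
            if b["subscriptions"] >= min_subs and b["links"] >= min_links]

survivors = drain_pool(blogs)  # Blog B falls below both cutoffs
```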
The problem with today's environment is that so many aggregators and search tools index blogs that pulling together the information for such an analysis is extremely time consuming. I do not know how long it took Walt to compile his study, but I suspect it was far longer than anyone would guess.
The following are some of the criteria used in his study and my comments:
- Frequency of Posts. The frequency of posts by a blogger is a topic that has seen some recent discussion, sparked by a post by Eric Kintz. As Kintz points out, frequent posting creates the equivalent of a blogging landfill. According to Technorati, only 11% of all blogs update weekly or more often. While an interesting statistic, frequency serves less as a quality indicator than as an indicator of a blogger's proficiency.
- Comments. The number of comments on a post does provide some insight into which posts are hot topics or hit a particular nerve. Walt refers to comments as "Conversational Intensity." The theory here is that interesting or controversial posts will result in a higher number of comments.
As Walt points out, there are blogs that do not have comments activated, which causes some problems. A concern of mine is that within certain communities a core group of bloggers will comment on each other's posts, much like citing a friend's work. Authors will also respond to each comment posted. Both behaviors artificially inflate the comment total.
Still, I view comments (and topic-spawned posts) as the blogging equivalent of peer review. In many respects, this post-publication comment process may be more critical, and may advance concepts further and faster, than the traditional peer-review and publication process. The challenge is encouraging "quality" comments and getting buy-in from the academic "traditionalists."
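To make the inflation concern concrete: a raw comment count could be discounted by skipping the post author's own replies. A minimal sketch, assuming hypothetical comment records and author names (this is an illustration, not a metric Walt actually computes):

```python
def conversational_intensity(comments, post_author):
    """Count comments on a post, excluding the author's own replies,
    since author responses inflate the raw comment total."""
    return sum(1 for c in comments if c["author"] != post_author)

# Hypothetical comment thread for illustration.
thread = [
    {"author": "reader1", "text": "Great point."},
    {"author": "blogger", "text": "Thanks!"},      # author's own reply
    {"author": "reader2", "text": "I disagree."},
]
score = conversational_intensity(thread, "blogger")  # 2, not 3
```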
- Length of Posts. Walt classified posts longer than the average of 268.5 words as "essays" and those shorter than a quarter of the average as "terse." The question is which approach has more impact. Since I am discussing scholarly communication, I would propose that essays have a higher impact. However, sometimes a three-page article will have more impact than a 50-page article.
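The length cutoffs above reduce to a simple rule. A minimal sketch using the 268.5-word average reported in the study, with a naive whitespace word count (not necessarily how Walt counted words, and the "medium" label for the middle band is my own):

```python
AVERAGE_WORDS = 268.5  # average post length reported in the study

def classify_post(text):
    """Label a post by length using the essay/terse cutoffs."""
    words = len(text.split())  # naive whitespace tokenization
    if words > AVERAGE_WORDS:
        return "essay"
    if words < AVERAGE_WORDS / 4:  # shorter than a quarter of the average
        return "terse"
    return "medium"
```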
I would like to thank Walt for his analysis, since blog quality indicators and impact factors are issues I am very interested in now that I am on the local promotion and tenure committee. As with any such analysis, coming up with a set of metrics to identify quality is a challenge. This is certainly a great stepping-off point for future discussions.
I will have to wait and see how many comments and links this essay receives.