Tuesday, May 26, 2009

Assistance in Evaluating Digital Scholarship?

As a follow-up to my Faculty Review System FAIL post, an article by Scott Jaschik entitled Tenure in a Digital Age appeared in today's Inside Higher Ed.

The article highlights a consortium called the Humanities, Arts, Science and Technology Advanced Collaboratory (HASTAC), which has produced a draft guide (on a personal Web site, not the consortium's) offering departments guidance on the approaches various colleges use to evaluate digital scholarship, on resources available to scholars who want an outside take on a project, and on policies that could be adopted to assure fair treatment of those coming up for review.
One reason for the new effort is that shifts in publishing may make it impossible for a growing number of academics to submit traditional tenure dossiers. With many university presses in financial trouble and others -- notably the University of Michigan Press -- turning to electronic publishing for monographs, there will be fewer possibilities for someone to be published in the traditional print form that was once the norm for tenure.
The article notes that the shift under way isn't just about digital scholarship, but about how tenure committees are being forced to learn much more about candidates, and about how their work was evaluated, than has been the norm.
So many tenure decisions have been made on the basis of assuming that a university press has a sound peer review system -- and one that can be relied upon -- that tenure has been outsourced, some say. Now, new models of scholarship are forcing these committees to closely consider how they know a candidate is producing good work.
The article also echoed my comments about how activities that do not fit neatly into teaching, research, or service get categorized as service:
Many tenure review procedures are based on an assumption that a junior professor's work can be divided easily into teaching, research and service. Feal noted that one of the exciting aspects of the new digital projects being created is that they advance scholarship and create teaching tools at the same time. Professors shouldn't be forced to pick between one category and another. Similarly, those involved in this project say that some college departments just categorize anything digital as service, a solution seen as unsatisfactory because many of these projects are in fact focused on scholarship and teaching, and because service typically doesn't count for much in tenure reviews.
I haven't had time to look over the HASTAC documents at length, but I wanted to communicate their availability right away. I am very interested in seeing how they could be applied or modified for our library faculty.

Friday, May 22, 2009

Faculty Review System FAIL?

One of the issues that our AP&T committee is trying to get a handle on is how we can make sure that our criteria reflect the current and changing models of scholarship emerging both within the library profession and across the academy.

In Talk About Talking About New Models of Scholarly Communication, Karla Hahn, Director of the Office of Scholarly Communications at the Association of Research Libraries (ARL), helps to define scholarly communication. It is:
... knowledge transmission—even if it is simply passing information from one brain to another through speech, e-mail, submission to a database, the display of an image or video, or through a formal writing and printing process. In contrast, scholarly publishing is a subset of communication activities mediated through the use of a durable medium to fix knowledge.
The traditional definition of scholarly communication is the publication of monographs and journals. This has served as a useful model since traditional publication can be clearly distinguished from other communication practices. Our library faculty as a whole conforms to this conventional definition of scholarly communication. Even so, a growing number of our faculty feel that new models of scholarly communication are not only just as valid as the traditional forms but also critical for real-time knowledge transmission.

The obstacle to pursuing new forms of scholarly communication appears to be a rewards system that still places a major emphasis on traditional publishing models. Yes, we need to be careful not to rebalance priorities in a way that devalues traditional scholarship. But we also need to strike a balance so that alternative forms of scholarly communication - and scholarly activities in general - are supported and rewarded as scholarship, not at traditional scholarship's expense.

The University of California's Office of Scholarly Communication issued a paper entitled Faculty Attitudes and Behaviors Regarding Scholarly Communication. They observed:
The majority’s lack of motivation to alter behavior appears to be connected... to the tradition-bound tenure and review process... the current tenure and promotion system drives them to focus on conventional publishing activities... Assistant Professors show consistently more skepticism about the ability of tenure and promotion processes to keep pace with or foster new forms of scholarly communication.
The groundswell for changing how scholarly activities are defined and evaluated is building. The thinking is that scholarly activities must not be judged by traditional review and distribution methods but according to (appropriate) standards for significance, excellence, and impact. The problem is that no one seems able to agree on what those standards should be. And at what price?

Boyce D. Watkins, an assistant professor of finance at Syracuse University, is reacting to Syracuse's decision to deny him tenure and let him go by saying he had been led to believe that the university's standards for judging faculty publications had changed, putting less emphasis on refereed journals. Watkins points to SU Chancellor Cantor's efforts to encourage more faculty engagement with the public and says he interpreted her call for "scholarship in action" as a green light to focus on publishing and publicizing his work in the mainstream media.

As a growing number of scholarly activities depart from established academic patterns, review committees simply do not know what to do with them. Since they don't fit neatly into one of the three legs of facultyness - teaching, research, and service - such activities are commonly classified as service. This generic classification is, of course, problematic, since emerging forms of scholarly communication end up dumped into that category and assessed as service, not as scholarship.

When it comes to review time, as Professor Watkins experienced, innovative scholarly activities are not given the weight they deserve. We fall back on classifying activities into one of the three legs, even when those classifications no longer make sense, and then revert to treating traditional communication models as the only ones that really have any value.

As UC observes, committees need to begin treating value as intrinsic to the work and its use, rather than predetermined by how it is classified. Review committees need to remove this presumption from the evaluation process so they do not prejudge what they are evaluating. Scholarship other than publication needs to be assigned greater value, and the teaching and service categories need to be renovated so that contributions in them can serve as a basis for tenure or promotion.

After all, where will the motivation to become more innovative scholars, to be involved in more interdisciplinary endeavors, or to engage in new activities come from if faculty must conform to reward systems that reflect a bygone era? (see: Copyright Law) Faculty review cultures and processes that have been perfected for traditional scholarship need to be replaced with structures optimized for digital scholarship.

Are we really trying to fit a square peg into a round hole here? Can we really expect to break out of the existing paradigm of how we define and assess scholarly activities if we limit our thinking to fitting into the existing system?

Monday, May 18, 2009

#MLA09 Tech Trends Panel Video #mla09tech

I am participating 'live' on a tech trends panel session today at the Medical Library Association Conference underway in Hawaii. You can follow the panel on May 18, 2009, at 4:30 p.m. (Eastern) via TweetChat or the CoverItLive pilot.


Thursday, May 14, 2009

MLA 2009 Cover It Live Feed #mla09

I am playing with Cover It Live for the Medical Library Association annual meeting. I have a feed set up for the Tech Trends panel but decided to test it on a whole conference feed.


Monday, May 11, 2009

Duped by Elsevier? Run to Your Local Library. NOW!

I was one of those who expressed outrage last fall over the Journal of Access Services' decision to devote an entire issue to the postings of an anonymous blogger. I felt the editor's and publisher's decision to lower their quality standards exposed a crack in the foundation of scholarly communication. Another controversy has emerged in the past few weeks.

Between 2003 and 2005, Elsevier produced several issues of the Australasian Journal of Bone and Joint Medicine. The controversy: it was recently exposed that Elsevier was paid by Merck to publish the 'journal,' which contained favorable data about the Merck products Fosamax and Vioxx, without any disclosure of Merck's involvement.

The Scientist obtained two PDF issues of the journal: Volume 2, Issues 1 and 2, both dated 2003. Other titles published by Elsevier and paid for by Merck include the Australasian Journal of General Practice, the Australasian Journal of Cardiology and the Australasian Journal of Cardiovascular Medicine. 

An interesting analysis at the Chemistry Blog revealed that 63% of the articles were favorable to Merck.

There is plenty of outrage over the ethics in this case, and Elsevier deserves all the negative press, but I am not going to pile on. Instead, after looking over the issues, I believe there were plenty of clues that the publication was not scholarly in nature, regardless of its title and peer-reviewed look. We should all be concerned if scientists were actually duped into thinking the journal was scholarly communication:
  • The journal is not indexed in MEDLINE. Sure, many journals are not in the database, but it is still the premier discovery tool, and any journal of value would be in it.
  • It's not in ISI Web of Knowledge. Ditto.   
  • Most scientific research journals include financial and funding disclosures. The lack of them should raise a few flags. 
  • How many 'review' articles cite only one or two articles?  
  • The editorial board is 'honorary.' 
  • There were no 'instructions for authors.'
  • Many of the 'articles' had NO authors. 
  • Most science/tech articles these days have a DOI.  
  • The journal had no website.
I'm sure there are many more. 
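
Several of these clues can even be checked by script rather than by eye. Here is a minimal sketch in Python of the first clue, the MEDLINE check, assuming NCBI's public E-utilities esearch endpoint and the NLM Catalog's "currentlyindexed" filter behave as documented; the helper name and sample titles are my own illustrations, and a reference interview with a librarian will still beat any script.

import json
import urllib.parse
import urllib.request

# Hypothetical helper: ask the NLM Catalog (via NCBI E-utilities) whether a
# journal title turns up among titles currently indexed for MEDLINE.
EUTILS_ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def medline_indexed(journal_title):
    """Return True if an NLM Catalog record for the title is currently indexed."""
    term = '"%s"[Title] AND currentlyindexed[All]' % journal_title
    params = urllib.parse.urlencode({"db": "nlmcatalog", "term": term, "retmode": "json"})
    with urllib.request.urlopen(EUTILS_ESEARCH + "?" + params) as resp:
        data = json.load(resp)
    # A count of zero means no currently indexed catalog record matched the title.
    return int(data["esearchresult"]["count"]) > 0

for title in ("Australasian Journal of Bone and Joint Medicine",   # the advertorial
              "Journal of the Medical Library Association"):       # a genuine journal
    print(title, "->", "indexed" if medline_indexed(title) else "not found")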

I'm unsure if scientists were actually duped. Chances are that most are more concerned over the idea that this occurred than whether any life-changing decisions were actually made based on articles in the advertorial.

Any scientist or professional who was duped needs to run - not walk - to their local library. Ask for the librarian on duty. Ask them to explain how to evaluate scholarly journals.

Monday, May 04, 2009

Participate in the MLA09 Tech Trends Panel - Virtually. #mla09tech

I will be a virtual presenter on the Technology Trends Panel at the 2009 Medical Library Association Conference. One of the topics I will be covering is virtual conference attendance.

As part of my research, I decided to play around with a service called CoverItLive. I first used the CiL service to virtually attend the 2009 Computers in Libraries conference. (Yes, both use the same CiL acronym.)


CiL is designed to provide a live stream from any event. Organizers can select any number of authors to provide live 'blogging,' and participants can then comment on those postings. Twitter streams can be imported into the CiL feed based on specific accounts or custom hash tags, which is nice since it does not require participants to post at two sites. Other useful features include embedding videos, asking poll questions, and archiving CiL sessions.
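
For anyone curious how that hashtag import works under the hood, here is a rough sketch of my own (not CiL's actual code) of polling Twitter's 2009-era public search endpoint for a hashtag and pushing new tweets into a feed; the endpoint, parameters, and field names reflect that old API as I understand it and should be treated as assumptions.

import json
import time
import urllib.parse
import urllib.request

# Illustrative only: the public Twitter search endpoint as it existed in 2009.
SEARCH_URL = "http://search.twitter.com/search.json"

def fetch_hashtag(tag, since_id=0):
    """Fetch recent tweets for a hashtag, newer than since_id."""
    params = urllib.parse.urlencode({"q": "#" + tag, "since_id": since_id})
    with urllib.request.urlopen(SEARCH_URL + "?" + params) as resp:
        return json.load(resp).get("results", [])

def merge_into_feed(tag, interval=30):
    """Poll the hashtag and print each new tweet into the 'live' feed."""
    last_seen = 0
    while True:
        for tweet in reversed(fetch_hashtag(tag, last_seen)):   # oldest first
            print("%s: %s" % (tweet["from_user"], tweet["text"]))
            last_seen = max(last_seen, int(tweet["id"]))
        time.sleep(interval)

# merge_into_feed("mla09tech")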

As a participant, you really have nothing to do during a CiL live event other than read, watch, and occasionally send a comment or vote.

I decided to set up a CiL site for the Tech Trends Panel as a pilot. It will make use of Twitter updates that use the hash tag #mla09tech. I will also embed the two videos presented and any additional resources.  

Feel free to sign up for the session with a reminder below. It is also important to note that the only support for the site will be five time zones away from the conference.

