Updating a Collections Assessment Rubric

by Kathleen Reed, Assessment and Data Librarian, Instructor in the Department of Women’s Studies, and VP of the Faculty Association at Vancouver Island University

Last year, I wrote a blog post that mentioned the rubric my place of work (MPOW) uses to assess collections. The rubric is a collaborative document designed by colleagues Jean Blackburn, Dana McFarland, and me. Recently on Twitter, a few people mentioned the rubric again and made some suggestions for additional items to consider. For this blog post, I thought I'd go over some of these suggestions and discuss the way the document has been used over five years at MPOW.

The 27-point rubric emerged from a recognition that generic metrics like cost-per-use weren't sufficient for deciding whether to renew or cancel products. We needed a system to look at products in a broader information context. Thus, the rubric was born. In its first five years, it has proven very valuable to librarians. We use it when products come up for renewal, often grouping "like" databases together into baskets (e.g., the Big Deals basket, the videos basket) for easy comparison and a more holistic overview. Our liaisons find it useful when talking to faculty about potential cancellations.
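For readers who like to see the workflow concretely, here is a minimal, hypothetical sketch of how rubric scores might be tallied and products grouped into baskets for side-by-side comparison. The criterion names, weights, and product names are all invented for illustration; the actual rubric has 27 points and is not reproduced here.

```python
# Hypothetical sketch: tally rubric ratings per product and group products
# into "baskets" so like databases can be compared side by side.
from collections import defaultdict

def rubric_score(ratings):
    """Sum per-criterion ratings into a single comparable score."""
    return sum(ratings.values())

def group_into_baskets(products):
    """Group products by assigned basket, sorted by score (highest first)."""
    baskets = defaultdict(list)
    for product in products:
        baskets[product["basket"]].append(product)
    for name in baskets:
        baskets[name].sort(key=lambda p: rubric_score(p["ratings"]),
                           reverse=True)
    return dict(baskets)

# Invented example products with made-up criteria and ratings
products = [
    {"name": "Database A", "basket": "Big Deals",
     "ratings": {"cost_per_use": 2, "uniqueness": 3, "accreditation": 1}},
    {"name": "Database B", "basket": "Big Deals",
     "ratings": {"cost_per_use": 3, "uniqueness": 1, "accreditation": 0}},
    {"name": "Video Service C", "basket": "videos",
     "ratings": {"cost_per_use": 1, "uniqueness": 2, "accreditation": 2}},
]

baskets = group_into_baskets(products)
for name, items in baskets.items():
    print(name, [(p["name"], rubric_score(p["ratings"])) for p in items])
```

The point of the basket grouping is that a raw score only becomes meaningful next to scores for comparable products, which is what the sorted output gives you at a glance.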

We haven’t adapted the rubric much, only adding a “required for program accreditation?” question as our institution’s programs expand. But the broader information context in which the rubric sits has shifted, and some new suggestions make sense. Ryan Reiger proposed that an open access lens would be helpful, considering: “OA options for authors, OA percentage, Copyright Override in license, [and] Alternative routes for access.” DeDe Dawson suggested “other friendly license terms such as: support for text & data mining, and no non-disclosure agreements.” As increasing numbers of librarians critique vendors, emphasize open access, and demand transparency, these suggestions are worth incorporating into a rubric v2.0.

Thanks to Ryan and DeDe for sharing their thoughts on the rubric. If you have ideas on how to improve this tool, feel free to leave them in the comments below.

(Editor’s note: Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. We apologize for this annoying problem.)

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

