Is It Possible To Develop An Evidence-Based Complement Plan?

by Frank Winter
Librarian Emeritus, University of Saskatchewan

Although not typically phrased as such, librarian labour – what it is, how much of it a library has, how best to deploy it – underlies the ongoing discussion of how best to deliver the services needed by the host institution. Opinions abound on what is needed and what is desirable. Proposals for new or modified roles and services are often received as something that can only be achieved using incremental resources rather than by internal reallocation, a stance based on the oft-voiced assertion that each member of the current complement is already more than fully occupied with existing responsibilities.

The common unit of analysis in these discussions tends to be the position held by an individual librarian, considered as an indivisible worker. But any position, with its associated responsibilities, is made up of a bundle of tasks. Considering workload and complement in a position-based sense can forestall systematic discussion of the options for any given task: assigning or reassigning it up, down, sideways, or out (through partnerships and collaborations, outsourcing, or assignment to a different group of employees in the library or elsewhere on campus), redefining it, applying technological options, or not doing it at all. These options are all part of the standard toolkit for short-term situations such as leaves and vacancies, and perhaps for longer-term situations such as downsizing, but they do not typically seem to be part of the discussion of longer-term complement plans.

Complement plans are assembled from many component parts. Although there is typically a great deal of professional judgment that goes into complement planning, it is often individual, implicit, and fraught with the usual power dynamics of any group process and all the other pitfalls of planning and decision-making.

Is it possible to employ the processes and tools of evidence-based library and information practice (EBLIP) to develop a complement plan that would address some of these challenges and produce a robust planning document? A quick review of the relevant evidence-based literature suggests that such an approach has not yet been reported but might be productive.

What would such a process look like using the 5 As (Articulate, Assemble, Assess, Action, Adapt, and the iterative process of their use) outlined by Denise Koufogiannakis (2013), together with her description of the types of evidence typically considered in library research and the “institutional, group-driven decision making” framework typical of library organizations? Constraints of space make a full discussion of each A impracticable, but a quick sketch might be a helpful starting point.

* Articulate

Koufogiannakis sketches out several points but it is important to recognize that a complement plan addresses the allocation of one subset of a library’s resources – librarian labour. As with every proposed resource allocation it is a political document incorporating budget choices that reflect values.

* Assemble

There is an embarrassment of riches with respect to potentially relevant evidence. Many of the sources would typically be included in the Environmental Scan section of any Strategic Plan. What EBLIP provides is clarity of purpose in the Articulation stage and focus in assembling evidence at this stage. If the assembled evidence does not, at the Assessment stage, reveal enough about the librarian labour involved, then the evidence-based approach requires an iteration of this stage.

* Assess

Assessing the evidence is the next step in EBLIP. The standard criteria of credibility and validity apply as well as issues of relevance and context. Ensuring that at the Assemble step there is as much depth, breadth, and context as possible in the assembled evidence will aid in assessment. Transparency and inclusivity during the discussions are also important elements at this stage.

For example, although evidence from comparator libraries is often considered, it is actually quite tricky to find true comparators. It is important to be aware of similarities and differences, of which specific tasks and responsibilities are and are not included, and of the extent to which they might be distributed among others in the library and on campus. It is not particularly helpful to make assumptions about what any library or librarian is doing based on home pages or position titles. The arbitrariness of organizational structure on campus and within libraries sometimes makes it challenging to map apples to apples. At a minimum, personal contact should be made to ensure that the full situation is known. On the other hand, if a comparator library with approximately the same complement of librarians and roughly the same organizational mission is responsible for services not supported by the local library, then further investigation is needed to discover how that library distributes the responsibilities among its librarian complement. If a smaller university library delivers the same or even an expanded array of librarian-related services, then that, too, merits further investigation and perhaps further iteration of the Assemble stage.

It is necessary to assess the potential impact of the evidence on “the Library” and the librarians. Impacts range from measurable and substantial through to insubstantial and unmeasurable.

Evidence from existing librarians must be weighed to distinguish anecdotal empiricism and self-interest from credible evidence.

Another step to take at this point is to be clear about the appropriate unit of analysis when assessing evidence. It is not helpful to view “The Library” – either local or comparator – as an undifferentiated lump. It is more appropriate to disaggregate “The Library” into a bundle of things (work groups including librarians, physical locations, and so on) responding to differing user needs. This step will help in the assessment of what works, what doesn’t, and why. What might work in one area of a library might not be appropriate in another. This avoids the trap of trying to find one size that fits all.

* Action

Getting to agreement is obviously another critical step in the development of a complement plan. Koufogiannakis describes a number of criteria but it is her articulation of the outcome of this step that is important: Determine a course of action and begin implementation of the decision. If no action results from the work above (and acknowledging that a considered conclusion that no changes are desirable is a possible outcome), then arguably the process has been pointless.

In this respect, it is interesting to read the recent blog posting by Roger Schonfeld entitled Shaping a Library by Linking Planning and Budgeting, and the associated comments (2016). Even for the largest libraries, the librarian complement is typically a slowly evolving resource if viewed as being composed of positions. For smaller academic libraries, changing just one position can be a major and rare action in the overall composition of the complement. The Schonfeld posting highlights librarian time – a more fungible resource than positions – as the productive unit of analysis.

* Adapt

Have the goals and outcomes of the process resulted in their anticipated effect – the allocation of librarian labour that most effectively meets the current and emerging information needs of library users? If not, why not? At least one possible outcome at this stage (very much institution-dependent) is a conclusion that there is a diminished need for librarian labour. If this is the case, it makes for a pretty gloomy complement plan going forward. And so, the planning cycle returns to the Articulation stage.

In conclusion, the 5 As of EBLIP, combined with the collegial decision-making style typical of libraries, seem quite suitable for the development of useful librarian complement plans.


Koufogiannakis, D. (2013). EBLIP7 Keynote: What we talk about when we talk about evidence. Evidence Based Library and Information Practice, 8(4), 6-17.

Schonfeld, R. (2016, November 7). Shaping a library by linking planning and budgeting [Blog post].

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The “Why?” behind the research

by Andrea Miller-Nesbitt
Liaison Librarian, Schulich Library of Physical Sciences, Life Sciences and Engineering, McGill University

Lorie Kloda
Associate University Librarian, Planning & Community Relations, Concordia University

Megan Fitzgibbons
Innovation Librarian, Centre for Education Futures, University of Western Australia

When reading journal articles reporting original research, the content usually follows the IMRAD format: introduction, methods, results, and discussion. Word count limits, author guidelines, and other conventions mean that the researchers’ motivation for conducting the study is often left out. In this post we present our motivations for conducting a research study on librarians’ participation in journal clubs:

Fitzgibbons, M., Kloda, L., & Miller-Nesbitt, A. (pre-print). Exploring the value of academic librarians’ participation in journal clubs. College & Research Libraries.

Being an evidence-based practitioner can sometimes involve a bit of navel-gazing. Beyond using evidence in our professional work (e.g., for decision-making, evaluating initiatives, etc.), we may likewise ask questions about the outcomes of our own professional development choices.

After three years of facilitating the McGill Library Journal Club, we began to think about ways we could disseminate our experience and lessons learned, and most importantly, how we could determine librarians’ perceived outcomes of participating in a journal club. We felt anecdotally that participating in a journal club is worthwhile, but we wondered: Can we formally investigate the impacts of participation on librarians’ practice and knowledge? What evidence can we find to inform the decisions and approaches of librarians and administrators in supporting or managing a journal club? Is there a connection between journal clubs and evidence-based librarianship? We also wanted to learn more about approaches taken in a variety of journal clubs and how they define success for their group.

The McGill Library Journal Club was initially established in order to help foster evidence-based practice by reflecting on the library and information studies literature and using those reflections to inform practice. The journal club also provides a professional development opportunity for all those interested. Although the McGill Library Journal Club has experienced many of the same challenges as other journal clubs, it is still going strong after 6 years thanks to a core group of motivated facilitators. (For more information about the journal club’s activities, see the McGill Library Journal Club wiki.)

In order to answer these questions, we first had to agree on a definition of a journal club. After some reading and deliberation, we framed participation in a journal club as an informal learning activity: learning that occurs outside classrooms or training sessions, but still involves some coordination and structure. In this context, our research question was: “What do librarians perceive as the value of participating in a journal club?” We focused on academic librarians who participate in journal clubs to manage the scope of the study, but a similar approach could be taken in other library and information organizations as well.

Because we were interested in gaining insight into individuals’ experiences, we considered several methods and ultimately selected an in-depth qualitative method, the hermeneutic dialectic process (Guba & Lincoln, 1989). This is a method that we have seen used in the social sciences for evaluation and for reconciling diverse perspectives. At the time we were developing our research question, one of the authors (Lorie) was an assessment librarian with an interest in qualitative methods, and she brought Guba and Lincoln’s writing to the team for discussion. The method seemed appropriate for answering our research question and flexible enough to really capture study participants’ experiences – not just what we expected to hear. We believe that this is the first use of this method in LIS research, so an additional motivation for the study was to apply the approach in the field.

As per the method, we conducted semi-structured in-depth interviews with each participant. After the first interview, central themes, concepts, ideas, values, concerns and issues that arose in the discussion were written into an initial “construction” which captured the experiences and perceptions expressed by the interviewee. Then in the second interview, the participant was asked to react to some of the points brought up by the first interviewee, as expressed in the construction. The construction was added to after each interview, incorporating the perspectives of each successive interviewee and used to inform the subsequent interviews. At the end, all participants were given the opportunity to comment on the final construction and let us know whether their perspectives were accurately represented.

Ultimately, we believe that the findings of our published study are of interest to librarians who aim to create and sustain a journal club. In particular, it could offer insight as they form goals for their group and justify the activity, including seeking support and recognition for their group.

More details about the impacts of academic librarians’ participation in journal clubs are of course presented in the article. In addition, in collaboration with the C-EBLIP Research Network, we hope to compile additional resources about journal club practices in librarianship and open communication channels in the future. Watch this space, and please get in touch if you have any ideas about promoting journal clubs for academic librarians.


Digital humanities and the library: Where do we go from here? C-EBLIP Journal Club, October 6, 2016

by Carolyn Doi
Education and Music Library, University of Saskatchewan

Article: Zhang, Ying, Shu Liu, and Emilee Mathews. 2015. “Convergence of Digital Humanities and Digital Libraries.” Library Management 36 (4): 362-377.

In October, the C-EBLIP Journal Club met to discuss an article focused on the evolving domain of digital humanities and its role within the academic library. The article in question, “Convergence of Digital Humanities and Digital Libraries,” was published by Zhang, Liu, and Mathews in Library Management, a journal that aims to “provide international perspectives on library management issues… highly recommended for library managers.”1 The article discussed ways that libraries might support scholarship in digital humanities (DH), digging into aspects of content, technology, and services that the library might develop for digital humanities scholars. I was compelled to select an article on this subject, as I recently attended a web broadcast of the “Collections as Data” livestream, where DH and librarianship were discussed together several times2, leading me to consider my own background in musicology and librarianship and how the two might overlap through a digital humanities lens.

The members of the journal club chose to assess the article from a few different angles: context, audience, methodology, findings, and conclusions. Our discussion of the article was aided by use of the EBL Critical Appraisal Checklist.3 Developed by Lindsay Glynn, this tool is made up of a series of questions that guide the reader through assessment of a study, including its population, data collection, study design, and results.4 We found that using the checklist allowed us to think critically about each aspect of the study design and to assess its reliability, validity, and usability within our own professional context. A summary of our discussion is presented below.

Context & Audience

During our conversation, we noted that this article is aimed at library managers, or those who may be in an administrative role looking to gain a quick picture of the role of libraries in interacting with digital humanities scholars. It was noted that the link between libraries and digital humanities has already appeared in the literature on many occasions, and that to get a fuller picture of how libraries might approach this collaborative work, reading other critical opinions will be of utmost importance. One may want to consult the list of resources provided by the dh+lib folks, which can be found on their website, to get a sense of some of the core literature.5


Methodology

The methods section of this article describes how the researchers consulted various evidence sources to identify current challenges and opportunities for collaboration between DH and libraries. In this case, the authors state that they combined findings from a literature review with virtual and physical site visits to “humanities schools, research centers, and academic libraries.” The databases searched were listed, though the search terms were not; we felt that including this information would be helpful both for assessing the quality of the search and for other researchers hoping to replicate or build on the review. The search resulted in 69 articles, 193 websites, and 2 physical sites. While discussing the validity of these evidence sources, we felt that the literature and online site visits may provide a reasonably representative selection of sources to draw conclusions from, but that the sample of two physical sites was too small to support general conclusions.


Findings

Zhang, Liu, and Mathews’ findings include both challenges and opportunities for collaboration between the DH and digital library communities. How the evidence was weighed or analysed to arrive at these results was not clearly outlined in the paper, and we felt that including such information would help the reader evaluate the usefulness and reliability of the findings. A summary of the findings is provided in the accompanying chart.

Challenges

• “DH is not necessarily accepted as qualifying scholarship… novel methodologies and the theoretical assumptions behind their work have been questioned by their peers from traditional humanities schools of thought.”
• “The DH community has unbalanced geographical and disciplinary distributions… Related DH collections are not yet integrated. These digital collections are distributed in different schools, academic units, museums, archives, and libraries. Few efforts have been made to link related resources together.”
• “The technologies used in DH create barriers for new scholars to learn and for projects to be sustainable”

Opportunities

• Creating “knowledge through new methods”
• Working “across disciplines [that] are highly collaborative”
• Producing a “unit of currency…[that] is not necessarily an article or a book, but rather, a project…usually published using an open web platform, allowing users to dynamically interact with underlying data”
• Establishing “major scholarly communication, professionalization, and educational channels”


In the conclusion of the article, Zhang, Liu, and Mathews present a positive perspective on the opportunities for collaboration between the DH and library communities: “To make collaborative work more successful, we, LIS professionals, need to challenge ourselves to continuously grow new skill sets on top of existing expertise and becoming hybrid professionals. The DL community should strive to make ourselves more visible, valuable, and approachable to the DH community. Even better, the DL community need to become part of the DH community.”

On this point, the journal club’s conversation focussed on the capacity of libraries to take on these new collaborations, and whether we are necessarily prepared for such projects. These thoughts are echoed by Posner, who writes in her article, “No Half Measures: Overcoming Common Challenges to Doing Digital Humanities in the Library” that “DH is possible in a library setting…but that DH is not, and cannot be, business as usual for a library. To succeed at digital humanities, a library must do a great deal more than add ‘digital scholarship’ to an individual librarian’s long string of subject specialties.”6

The domain of DH is compelling and creative: it incorporates new methods, produces innovative means of dissemination, and combines diverse perspectives on research. Libraries are well positioned to contribute to this domain, though exactly how this should or can happen has no one-size-fits-all answer. Zhang, Liu, and Mathews present some good points that may serve to begin a conversation on how libraries and DH folks might work together. Each of these points merits further investigation by the librarian or administrator aiming to implement these strategies in their own institution.

1“Library Management.”

2Library of Congress. “Collections as Data: Stewardship and Use Models to Enhance Access” September 27, 2016. Accessed November 4, 2016:

3EBL Critical Appraisal Checklist.

4Glynn, Lindsay. “A critical appraisal tool for library and information research”, Library Hi Tech 24, no. 3 (2006): 387 – 399.

5“Readings” dh+lib. Website. Accessed November 4, 2016.

6Posner, Miriam. “No Half Measures: Overcoming Common Challenges to Doing Digital Humanities in the Library.” Journal of Library Administration 53, (2013): 43-52.


A small experiment to improve Facebook engagement

By Joanna Hare
Run Run Shaw Library, City University of Hong Kong

As I am sure is the case at many academic libraries, I am the sole person responsible for maintaining the Library Facebook Page. This means that a lot of my time is spent planning and scheduling content, with not as much time as I would like spent collecting evidence for the purpose of improving content. I regularly check and download Facebook Insights reports to keep an eye on how our page is doing, and of course I always pay attention to how much interaction a particular post is getting through comments, likes, or shares. Recently, however, I trialed a small experiment to see if I could improve the performance of a particular type of post: a weekly link to the Library’s Books of the Week blog.

Books of the Week is a place to share recommended Library books, usually related to a current event such as the Olympics or the beginning of semester. In the past, a feed was created so all new blog posts would be automatically posted to the Facebook page. This was causing a number of problems, such as the timing and number of posts becoming unpredictable, and the posts being poorly formatted in Facebook. Most importantly, the Facebook posts coming automatically from the blog were getting zero engagement, and the Reach of the posts was very low. A change was clearly needed.

I decided to stop the blog from posting automatically to Facebook and to post the item manually myself. I created a simple graphic to be used each week, and posting manually meant I could write the accompanying status to be more timely and unique. Even though manually posting the item each week only takes a few minutes, in terms of my job description and job performance I knew I would need to justify whether this increased manual work was worth the effort.

Based on an experiment described in this article, I started a log of the variables for each week’s Books of the Week post. The log included a link to the post, a description of the post such as the image dimensions, the length of the accompanying status, and the time and date of the post. Then, each week I recorded the basic units of measurement for the post provided by Facebook: the Reach and the Post Clicks. I was less interested in likes, comments, and shares in this instance – though of course I kept a record of them in my log – because metrics like Reach and Post Clicks are sufficient to see whether people are engaging with your content even if they don’t take the extra step of ‘liking’ a post: “…just because someone didn’t click on your specific post, if that post encouraged them to click anywhere else on your page, you’ve done a good job!” (Cohen, 2014)

For the first four weeks, I saw marked improvement in terms of the Reach, rising from 43 in the first week to 185 by the fourth week. At this point, I tweaked the method of posting. Rather than posting a link then adding the graphic as an attachment, I posted the graphic as a photo, with an html link in the description. Crucially, after digging into my Insights reports I found Facebook categorises the first type of post as a ‘Link’ and the second type as a ‘Photo’. The difference is very small in practice, and looks like this:

Fig 1: Image on the left shows a ‘Link’ post type. The second image shows a ‘Photo’ post type.

After making this change, the increase in the post’s Reach was remarkable – the figure jumped to over 500. Over the next six weeks I continued this method of posting, and the posts consistently reached over 800 users. Once in the six-week period I reverted to the first method, and the Reach dropped to 166. I returned to the second method and the Reach increased again; it has remained at or above 800, even since I stopped keeping a record of the variations each week.
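As a rough illustration, a weekly log like this can be summarized with a few lines of code. The sketch below uses hypothetical figures and field names (not the actual log data), but shows how average Reach could be compared across the ‘Link’ and ‘Photo’ post types:

```python
from statistics import mean

# Hypothetical weekly log entries: (post_type, reach, post_clicks).
# The numbers are illustrative stand-ins, not the real Insights data.
log = [
    ("Link", 43, 3), ("Link", 98, 4), ("Link", 150, 5), ("Link", 185, 6),
    ("Photo", 520, 5), ("Photo", 810, 6), ("Photo", 850, 4),
    ("Link", 166, 2),   # the one-week reversion to the 'Link' method
    ("Photo", 830, 5),
]

def average_reach(entries, post_type):
    """Mean Reach across all logged posts of the given type."""
    return mean(reach for ptype, reach, _clicks in entries if ptype == post_type)

print(average_reach(log, "Link"))
print(average_reach(log, "Photo"))
```

Even a simple comparison like this makes the gap between the two posting methods easy to demonstrate when justifying the extra manual effort.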

Much of the literature and marketing material about using Facebook recommends that page managers use images to engage their audience, so I suppose these results are not surprising. I did not, however, expect there to be such a difference in Reach simply because my post originated as a ‘Photo’ rather than a ‘Link’, when the content is essentially the same.

The general visibility of the posts was much improved with this method, but the change in the actual click through rate to the blog was less dramatic. On average around 5 people each week clicked on the post. My Insight reports show 2-3 of the clicks were to expand the image or description, while on average 0-1 people clicked the link to visit the blog. Quite disappointing!

Despite this, I do not think the exercise was in vain. Firstly, seeing for myself that images truly do have a greater Reach under Facebook’s algorithm is useful for all future posting practices. Secondly, I think it is valuable to have our posts become more visible on Facebook, increasing our presence on the platform in general. It seems that the manual effort (which is really only around 10-15 minutes each week, especially now that my colleagues assist in drafting the text and modifying the image) is worthwhile given the marked increase in the posts’ Reach and the small increase in the number of people clicking on them. This is just a small-scale way of using Facebook Insights, and in future I hope to use Insights more strategically in designing and delivering the Library’s Facebook content. In the coming weeks I will be experimenting with a more coordinated approach to Facebook, including a paid advertising campaign, and I look forward to sharing some of the results with the C-EBLIP community.


Busche, L. (2016, February 20). 10 Marketing Experiments You Can Run Yourself to Improve Your Reach on Social Media. Retrieved September 27, 2016, from

Cohen, D. (2014, August 6). Post Clicks, Other Clicks Are Important Metrics for Facebook Page Admins, Too. Retrieved September 27, 2016, from


Personality types and the 360° survey for professional development, Or “Being a Bulldozer and Still Winning Hearts”

by Tegan Darnell
Research Librarian
University of Southern Queensland, Australia

This short article is about how evidence-based practice applies on a personal/individual level, and how I’m using the outcomes of survey tools for reflective professional development.

As part of an ongoing leadership development program, I have completed a Myers-Briggs Type Indicator® (MBTI), and the slightly less well known 360° Life Styles Inventory™ (LSI). Both are evidence-based and meet rigorous academic and psychometric standards.

Although reluctant to be categorised, I have committed to ‘develop’ my practice and become a better and more effective leader. I endeavour to take what I have learned from this and use it in my practice.

The MBTI® told me I am an ENTJ type, or ‘the commander’, closely correlated with the ‘Fieldmarshal’ in the Keirsey Temperament Sorter (KTS). An ENTJ can be dictatorial, abrasive, and insensitive. Notable ENTJ types include Margaret Thatcher, Vladimir Putin, Steve Jobs, and Gordon Ramsay (the British chef who swears at people a lot).

It isn’t all bad… Basically, an ENTJ is a natural born leader with a ‘forceful’ personality. Also, Intuition (N) types have been shown to have significantly higher ego development (Vincent, Ward, & Denson 2013) – Apparently that is a good thing.

Last time I took the same test I came out as an INFJ, or ‘the advocate’, (the ‘counsellor’ according to the KTS) so my new result was somewhat of a shock. As a committed researcher-practitioner, however, I have to accept what the data is telling me. Quite a lot of time has passed since I last took the full questionnaire…

However, it was the 360° survey that was the most revealing.

In a 360° survey, not only do you complete a survey about your behaviours, but so do your boss, your direct reports, and your peers and colleagues. The differences between your self-evaluation and the perceptions others have of you are revealing.

I am a human bulldozer.

My colleagues rated me as unhealthily competitive, approval-seeking, and lacking in actual performance. Apparently I disregard others’ feelings and come across as cold and insensitive. On the positive side, my colleagues see me as an independent and unconventional colleague, and I am nowhere near as avoidant as I thought I was.

Sometimes, the evidence does not say what you want it to say. When the data is about a beloved Library service or resource, this is hard to take. When the data is about your personal performance and behaviour, it can be particularly difficult to reconcile. But, as with all research, I have asked the question, I have collected the data, and I have the results. Now, with this information, I need to make a decision about what action to take.

An ENTJ appreciates and welcomes objective and rational statements about what they do well and what could be done better. Criticisms, to me, mean: “Here is your challenge. Do whatever is in your power to make it happen”. So, I have accepted this challenge.

Being a bulldozer was never my intention, but if I am a bulldozer, I’ll be a bulldozer with friends, thank you. I’ll be working on my ‘encouraging’ and ‘humanistic’ behaviours, and doing lots of open communication (i.e., ‘listening’) over the next few weeks.


Member-Driven Decision Making

by Gwen Schmidt
Manager, Branches, Saskatoon Public Library

How many librarians does it take to change a lightbulb? If you can believe it, there are at least four different answers to this joke. My favourite is “Only one. But first you have to have a committee meeting.”

I have just finished a two-year term as the President of the Saskatchewan Library Association (SLA), following a period of previous involvement on the Board. It has been a fascinating time of change in our organization’s history, with a reshaping of our direction, our governance, and our style of decision making. I was glad to be a part of it.

One of the most interesting things about this time of change was the renewed SLA commitment to a member-driven philosophy. In 2010, the SLA membership determined that our Board structure needed an overhaul, and a Board Governance Task Force was struck. The Task Force took a look at our history, our values, our goals, and the challenges ahead, and set us on a new path with a new vision – central to which was the idea that our members would lead us.

We were always trying to be member-driven, but this renewed commitment to that idea came at a time when cheap/free consultative software tools exist in abundance, and when social media has given individuals the expectation that they can have their say easily. It was easier than ever before to make ‘member-driven decision making’ a reality.

My presidency fell during a time when strategic planning needed to be done. Instead of just doing it at the Board level, we did a broad survey of the members to find out what they think SLA should be doing, what they need from their library association, and what they think we do well. Once we had that data, we also took the opportunity at the annual conference to have conversations with people in person about it. We took broad themes out of the survey data, and had an in-person membership consultation where people could expand on those themes. All of that consultation helped us to build a robust strategic plan that is taking us forward.

During the same time, provincial and territorial and national library associations across Canada were considering the building of a new national federation together, which ultimately became the Canadian Federation of Library Associations (CFLA). SLA was invited to participate. Our member-driven philosophy set a roadmap for us: before committing to participation, we took the question to our members. Did they want us to go forward as part of that federation? If yes, within what parameters? Our members gave us a resounding mandate, and endorsed a set of parameters going forward. Consultation with them throughout the process of building the CFLA identified a problem to be solved: a shared Saskatchewan-Manitoba CFLA Prairie Provinces Representative position. Knowing what our members wanted allowed us to set up a Saskatchewan-Manitoba working group to determine the structure of the Prairie Provinces rep and to ensure strong communication and representation.

In associations, ‘member-driven decision-making’ sounds a little – or a lot – like evidence-based decision making. Instead of doing what we think our members want us to do, we ask them what they want us to do and then do that. Those collaborative conversations take time, but ultimately build trust and energy, and give better results in the end.

How many member perspectives does it take to make an association truly shine? A heckuvalotta them. But that makes the future bright.


Gathering Evidence by Asking Library Users about Memorable Experiences

by Kathleen Reed
Assessment and Data Librarian, Vancouver Island University

For this week’s blog, I thought I’d share a specific question to ask library users that’s proving itself highly useful, but that I haven’t seen used much before in library assessment:

“Tell me about a memorable time in the library.”

Working with colleagues Cameron Hoffman-McGaw and Meg Ecclestone, I first used this question during the in-person interview phase of an ongoing study on information literacy (IL) practices in academic library spaces. In response, participants gave detailed accounts of studying with friends, moments that increased or decreased their stress levels, and insight into the 24/7 Learning Commons environment – a world that librarians at my place of work see very infrequently, as the library proper is closed after 10pm. The main theme of answers was the importance of supportive social networks that form and are maintained in the library.

The question was so successful in the qualitative phase of our IL study that I was curious how it might translate to another project – an upcoming major library survey that was to be sent to all campus library users in March 2016. Here’s the text of the survey question that we used:

“Tell us about a memorable time in the library. It might be something that you were involved in, or that you witnessed. It might be a positive or negative experience.”

It wasn’t a required question; people were free to skip it. But 47% (404/851) of survey takers answered the question, and the answers ranged in length from a sentence to several paragraphs. While analysis isn’t complete on the data generated from this question, some obvious themes jump out. Library users wrote about how library services and spaces can either ease or cause anxiety and stress, the importance of social connections and accompanying support received in our spaces, the role of the physical environment, and the value placed on the library as a space where diverse people can be encountered, among many other topics.

To what end are we using data from this question? First, we’re doing the usual analysis – looking at the negative experiences and emotions users expressed and evaluating whether changes need to be made, policies created, etc. Second, the question helped surface some of the intangible benefits of the library, which we hadn’t spent a lot of time considering (emotional support networks, the library’s importance as a central place on campus where diverse groups interact). Now librarians are able to articulate a wider range of these benefits – backed up with evidence in the form of answers to the “memorable time” question – which helps when advocating for the library on campus, and connecting to key points in our Academic Plan document.
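For readers curious how several hundred open-text answers might be triaged before full qualitative coding, a rough first pass can be sketched in a few lines of Python. The responses and theme keywords below are invented for illustration, and simple keyword matching is only a starting point – it cannot replace careful human coding of the kind described above:

```python
from collections import Counter

# Hypothetical open-text survey responses (invented for illustration).
responses = [
    "Studying with friends in the Learning Commons kept my stress down.",
    "The library was too noisy during exams and raised my stress.",
    "I met people from all over campus in the group study rooms.",
]

# Assumed keyword lists for a rough first-pass theme tally;
# a real analysis would refine these against the data.
themes = {
    "stress": ["stress", "anxiety"],
    "social": ["friends", "group", "met people"],
    "space": ["noisy", "rooms", "commons"],
}

# Count each response at most once per theme.
counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in themes.items():
        if any(keyword in lowered for keyword in keywords):
            counts[theme] += 1

print(dict(counts))
```

A tally like this only flags candidate themes and their rough frequency; the interesting material – why a space eased or raised stress, for example – still comes from reading the answers themselves.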


Evidence Versus Intuition (Which is Really, de facto, Evidence)

by Gwen Schmidt
Outreach Coordinator, Saskatoon Public Library

I have never been a really great researcher. When I was in library school, our Research Methods class brought me to tears more often than I would have liked. Surveys and statistics and research papers, bah humbug.

What I am good at is patterns. Patterns in nature, patterns in process, patterns in human behaviour. A really intricate visual pattern will actually make the hair stand up on the back of my neck. I will be entranced. I have always been this way.

Lots of librarians find their way to librarianship because they love books. Don’t get me wrong; I read so many books as a kid that the library was my second home. I still read a lot of books. But what attracted me most to the library was the patterns. Call numbers. Classification schemes. Interlibrary loan processes.

In my 20 years as a professional, I have become a person who “is good at deliverables”, as my last Manager would say. I can build a process that is lean, sensible, efficient, and understandable. I have also become a connoisseur of human behaviour. I enjoy watching the patterns, and I can get a lot done by anticipating how people will behave in certain contexts.

So, when someone says the phrase ‘evidence-based library and information practice’ to me, two things happen: first, I get anxious and hyperventilate about research papers, and surveys, and statistics, and then I stop myself and start to wonder if ‘evidence’ means different things to different people.

I would like to posit that intuition is as important as evidence in decision-making, and that intuition is, in fact, a type of evidence. If you pay close attention every day to the work that you do, your brain starts to see patterns in workflow, in policy interpretation, and in how humans interact with your work. This is the ‘ten thousand hours’ of attention or practice that Malcolm Gladwell talks about in his book, Outliers – the attention and experience that make people really good at something.

Some libraries live by a self-imposed rule that all of their decisions need to be evidence-based, and this often means a nation-wide environmental scan, reading research papers, doing surveys, crunching statistics, and writing reports, all before that decision is made. I would suggest that sometimes there is not enough time to do all of this, and then intuition and years of paying attention need to come to the fore. Neither one is always a better approach, but both approaches need to be in your toolbox.

This is why you might do a bunch of quality formal research before you build a proposal, but you also need to run it past the people down on the ground who work with the processes every day. They can tell you whether or not your proposal is grounded in reality, and whether it will fly or not. They live and breathe where those processes will play out.

Do you need examples to know what I mean? Let’s get granular. At the public library, I have created a lot of programs that resonate with people, and a lot of these I developed using my gut instincts.

I have been programming for years, and, let me say, there have been a lot of duds. Every well-attended or poorly-attended program is a learning opportunity, though, I always say. An opportunity to pay attention. Why did it work? Why didn’t it work? Why do other librarians’ programs work? What are the goals I am trying to accomplish in the first place, and how did this program accomplish those goals or not? What did library patrons say they wanted for programs, but also what programs did they actually show up for? What little things annoy people? Make no mistake: the intuitive approach needs to be fairly rigorous if it is going to work.

If people come to a program, I call that ‘voting with their feet’. After a few years of paying close attention to human behaviour related to programming, and also paying close attention to the things that annoy all of us, the patterns started to emerge for me. Here’s what I know.

Teens are way more engaged in a program if you give them lots of responsibility and make them do all the work. This sounds kind of unbelievable, but it’s true. They do not need us to deliver them fully-formed content to enjoy passively – they can get that from TV or the Internet, and it will always be better than anything we can do. What they need is a challenge or an invitation to create. Since we started to program for teens on this concept, my library has had amazing success with the “Teen Poetry Celebration” (teens write poems), the “We Dare You Teen Summer Challenge” (a literacy scavenger hunt and activity challenge), “Teen Advisory Councils” (teen library club), and most recently the “Book Trailer Contest” (teens make video trailers for books). We get good attendance numbers, and the teens build amazing things.

Other groups of people have patterns too. Most people are too busy to get to a program on a particular date, but they will start to trust you if the program happens repeatedly in a predictable fashion and they don’t have to register. I used to run one-off programs, and sometimes people would come and sometimes they would not. At the same time, a weekly drop-in armchair travel program and weekly drop-in children’s storytimes across the system would attract 20-90 people each time. Why wouldn’t I set up important programs in a weekly, drop-in (no registration hurdles) format? So that’s what we did. We built a weekly drop-in program called “BabyTalk”. Weekly drop-in works for moms and babies, because there is no stress if they miss it, and they can choose to attend at the last minute. I currently run a weekly drop-in program called “iPad Drop-In”, for seniors. The seniors tend to come over and over again, and start to get to know each other. They will also let us teach them things that they would never come to a one-off to learn (e.g. How to Search the Library Catalogue). We get about sixteen people each week with very little effort. It is lean, sensible, efficient, and understandable. The only other thing we need to do is to make sure that we deliver a great program.

These are only a few of the intuitive rules that I live by in my job. Intuition based on watching seniors vote with their feet, watching moms and babies get in the class or not get in the class, teens participate or not participate.

With current developments in neuroplasticity research and the explosion in social media use, there are a ton of popular psychology books out about paying attention, mental focusing, and intuitive decision-making. So, is intuitive decision making a form of evidence-based librarianship? I think so, based on all the patterns I’ve seen.

(I am currently reading “Focus: The Hidden Driver of Excellence” by Daniel Goleman.)


EBLIP and Public Librarians: A Call to Action!

by Pam Ryan
Director, Collections & Technology at Edmonton Public Library / Twitter: @pamryan

As a former academic librarian, I’m often asked what the biggest differences are between public and academic libraries and librarianship. My short answer is usually something about having worked for only one example of each (each excellent and probably non-standard), so it’s difficult to know whether the differences I’ve experienced are more organizational or sectoral. However, an increasingly concerning difference is the relationship that public librarians have with the research and evidence base of our profession.

Low public librarian participation in research and publication is not a new phenomenon, nor is the small overall percentage of LIS research articles about public library practice. Research in 2005 showed that over a four-year period, just 3% of article authors in North American LIS journals were employed in public libraries. Even in Public Library Quarterly, only 14% of the authors were public librarians. An earlier study in 2001 showed that only 7% of LIS research articles were public library oriented [i].

The recommendations in the 2014 Royal Society of Canada Expert Panel report on Canada’s libraries call for increased sharing of research and statistics to support evidence-based practice in public libraries. The recommendations specifically include a call to action for public libraries to make their work visible by posting evidence-based studies on library websites for the benefit of the entire library community, in addition to continuing to share statistical data freely with CULC and other organizations [ii].

These recommendations follow from the fact that public libraries are increasingly called upon to show their value and prove their impact, yet we are not actively in charge of telling our own story by sharing our organizational practice findings or enlisting our librarians to share their work outside of internal operational functions. We need to heed this call to action both as organizations and as individual professionals. I am keenly aware of all of the good program evaluation and assessment work that goes on in public libraries to inform services and innovation, yet too frequently it is not taken the step further, to openly available publication, to build our evidence base, inform our collective practice, and be available to tell our stories.

Of particular note in this call to action is the need to openly and freely post this work of our public libraries and librarians. A very distinct and frustrating difference between academic and public librarianship is access to the literature behind paywalls. I am well aware of how frequently I beg PDF articles from academic colleagues and also, embarrassingly, how much less frequently I dip into the literature because access to it isn’t as seamless as it was when I was an academic librarian. Open Access publishing options for our own literature need a much higher profile than they currently have, and this is something our entire sector needs to work on.

Where to start? As an example, Edmonton Public Library (EPL) recognizes that research and its dissemination are integral to being innovative. EPL provides two recent librarian graduates from the University of Alberta’s School of Library and Information Studies with one-year research internships. These new professional librarians conduct research that is invaluable to EPL’s future planning. Recent assignments on digital public spaces and open data; digital discovery and access; 21st-century library spaces; and analyzing the nature and types of questions received at service desks have also included the expectation of openly sharing internal reports [iii] via the EPL website, as well as publication in Open Access forums [iv][v][vi][vii]. Librarians working on innovative projects are also encouraged to share their practice and findings openly [viii][ix]. Providing the encouragement, support, time, and expectation that sharing be an integral part of public librarian practice is something all libraries can foster. We need to collectively take responsibility for changing public library culture and take ownership of telling our own stories and sharing our evidence.
[i] Ryan, Pam. 2012. EBLIP and Public Libraries. Evidence Based Library and Information Practice. Vol. 7:1.

[ii] Demers, Patricia (chair), Guylaine Beaudry, Pamela Bjornson, Michael Carroll, Carol Couture, Charlotte Gray, Judith Hare, Ernie Ingles, Eric Ketelaar, Gerald McMaster, Ken Roberts. 2014. Expert Panel Report on The Future Now: Canada’s Libraries, Archives, and Public Memory. Royal Society of Canada, Ottawa, ON. Pg. 120.

[iii] Publications. Edmonton Public Library.

[iv] Arnason, Holly Kristin and Louise Reimer. 2012. Analyzing Public Library Service Interactions to Improve Public Library Customer Service and Technology Systems. EBLIP and Public Libraries. Evidence Based Library and Information Practice. Vol. 7:1.

[v] Wortman, Beth. 2012. What Are They Doing and What Do They Want: The Library Spaces Customer Survey at Edmonton Public Library. Partnership: the Canadian Journal of Library and Information Practice and Research. Vol. 7:2.

[vi] DaSilva, Allison. 2014. Enriching Discovery Layers: A Product Comparison of Content Enrichment Services Syndetic Solutions and Content Café 2. Partnership: the Canadian Journal of Library and Information Practice and Research. Vol. 9:2.

[vii] Carruthers, Alex. 2014. Open Data Day Hackathon 2014 at Edmonton Public Library. Partnership: the Canadian Journal of Library and Information Practice and Research. Vol. 9:2.

[viii] Haug, Carla. 2014. Here’s How We Did It: The Story of the EPL Makerspace. Feliciter. Vol. 60:1.

[ix] Carruthers, Alex. 2015. Edmonton Public Library’s First Digital Public Space. The Library as Incubator Project. January 20, 2015.
