The right tool for the job: NVivo software for thematic analysis

by Carolyn Doi
Education and Music Library, University of Saskatchewan

This post builds on an earlier one by research assistant Veronica Kmiech, which outlines the process of searching for and identifying literature on how practitioners in cultural heritage organizations manage local music collections.1 I have worked with Veronica on this project since summer 2016. It led to a thematic analysis of the literature, seeking to better understand the professional practices implemented, and the challenges faced, in managing, preserving, and providing access to local music collections in libraries and archives.2

Using NVivo to facilitate the thematic analysis in this project proved extremely helpful for organizing and managing the data. With over fifty sources to analyze in this review, the thought of doing the work manually seemed daunting.

Thematic analysis typically takes the researcher from familiarization with the data, through development of codes and themes, and finally to tying those themes to the broader picture within the literature.3 NVivo becomes particularly useful at the coding and theme development stages.

During the coding phase, NVivo helps save a description and inclusion and exclusion criteria for each code. These are fairly easy to change as needed, it is easy to create hierarchies within the node sets, and being able to see an overview of the codes you are working with is definitely helpful. Once code labels are identified, coding the dataset involves a lot (a lot!) of highlighting and decisions about which node(s) to assign to each piece of text. Adding new nodes is fairly simple, which matters because new themes will likely come up throughout the coding process. Word to the wise: coding is made easier with NVivo, but the software doesn’t do all the work for you. Schedule extra time for this portion of the research.
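NVivo stores all of this internally, but the shape of a codebook is easy to picture. As a rough sketch (the node names and criteria below are invented for illustration, not taken from the study), a node set with descriptions, inclusion/exclusion criteria, and a hierarchy might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One code ('node' in NVivo's terminology) in a codebook."""
    name: str
    description: str = ""
    include: str = ""      # inclusion criteria for this code
    exclude: str = ""      # exclusion criteria for this code
    children: list = field(default_factory=list)  # child nodes in the hierarchy

def node_names(node):
    """Flatten a node hierarchy into a list of node names."""
    return [node.name] + [n for child in node.children for n in node_names(child)]

# An invented example hierarchy: one parent node with two children.
codebook = Node(
    name="Collecting rationale",
    description="Why an organization collects local music",
    include="Statements of purpose or motivation",
    exclude="Descriptions of acquisition mechanics",
    children=[
        Node(name="Documenting local culture"),
        Node(name="Community engagement"),
    ],
)

print(node_names(codebook))
# ['Collecting rationale', 'Documenting local culture', 'Community engagement']
```

Seeing the codebook as a simple tree like this is also a reminder of why reorganizing nodes under new parents is cheap, while the coding itself (tagging each text segment with node names) is the slow, human part.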

During the phase of theme development and organization, NVivo made it quite easy to sort nodes into broader themes. In practice, this process took a few revisions to fully think through how and why nodes should be sorted and organized. The software has features that assist with finding significance within the themes, including the ability to create mind maps, charts, and word frequency queries. After this process, I identified five broad themes within the literature, some with as few as three associated nodes and some with as many as thirteen (fig. 1).
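A word frequency query, for instance, is conceptually simple. Here is a minimal sketch of the idea in Python; the sample sources and stopword list are invented for illustration, and NVivo’s actual query is far more capable (stemming, synonym grouping, and so on):

```python
import re
from collections import Counter

def word_frequency(texts, stopwords=frozenset({"the", "of", "and", "to", "a", "in"}), top=5):
    """Count words across all sources, ignoring common stopwords,
    and return the `top` most frequent words with their counts."""
    words = []
    for text in texts:
        words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stopwords]
    return Counter(words).most_common(top)

# Two invented snippets standing in for coded sources:
sources = [
    "Local music collections preserve local culture.",
    "Practitioners manage local collections in libraries and archives.",
]

print(word_frequency(sources))
# 'local' and 'collections' surface as the most frequent terms
```

Even a toy version like this shows why the query is useful early in theme development: recurring vocabulary across sources is often the first hint of a candidate theme.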

Figure 1: Themes and node hierarchy

Following the development of this hierarchy, I went back into the literature to find examples of how each theme was applied and referred to. When presenting the analysis, these examples were helpful in illustrating the underlying narrative.

This example (fig. 2) shows the nodes within the theme that brings together data on why practitioners choose to collect local music.

Figure 2: Goal and Objective theme

To better illustrate the significance or application of these concepts, I used quotes from the literature as examples. This excerpt works particularly well as an illustration of why heritage organizations might choose to collect local music, why it may present challenges, and why it can be considered unique:

The Louisville Underground Music Archives (LUMA) project was born of the need to document this particular, and important, slice of Louisville’s musical culture. …from a diverse community of bands and musicians, venue and store owners, recording studios and label managers, and fans to maintain the entire story from a broad range of perspectives.4

Pulling quotes such as this one helped me build a narrative around the themes I’d identified, and they serve as a gateway into the literature being analyzed.

The process of analyzing the data this way provided me with a rich resource on which to build the literature review, and a unique map of what the literature represents. While NVivo has some flaws and drawbacks (price, switching between operating systems, and working collaboratively were notable obstacles), the benefits (a gentle learning curve, saving the time of the researcher, and considerable help with organizing data and synthesizing themes) outweighed them in the end. I highly recommend NVivo as a tool to keep in your back pocket for future qualitative analysis projects.

1 “Locating the local: A literature review and analysis of local music collections.”
2 Results from this analysis were recently presented during the 2017 annual meeting of the Canadian Association of Music Libraries (CAML) in Toronto, ON in a paper titled Regional music collection practices in libraries: A qualitative systematic review and thematic analysis of the literature.
3 “About Thematic Analysis.” University of Auckland.
4 Caroline Daniels, Heather Fox, Sarah-Jane Poindexter, and Elizabeth Reilly, “Saving All the Freaks on the Life Raft: Blending Documentation Strategy with Community Engagement to Build a Local Music Archives,” The American Archivist 78, no. 1 (2015): 238–261.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Grey areas in research

by Christine Neilson, Knowledge Synthesis Librarian
Neil John Maclean Health Sciences Library
Centre for Healthcare Innovation
University of Manitoba

Through the course of my day-to-day duties, I came across an interesting article by Adams et al. about searching for “grey information” for the purposes of evidence synthesis in public health. For anyone who is unfamiliar with it, evidence synthesis is much more than a literature review. It involves identifying and tracking down all relevant evidence, evaluating it, extracting data, contacting authors to request additional data not included in a published article (where applicable), and using that larger pool of data to answer the question at hand. A thorough evidence synthesis includes grey literature – literature published by an entity whose main business is something other than publishing – in an attempt to reduce bias. There can be some seriously heavy statistical analysis involved, and the entire process is a heck of a pile of work. Adams et al. took the idea of grey literature, extended it to other information that is difficult to get hold of, and provided a critical reflection on three separate projects where they relied heavily on “grey information”. When I read their article, I was struck by two things.

First, Adams and colleagues were interested in public health programs that came out of practice, rather than formal research. As they point out, “Interventions and evaluations that were primarily conducted as part of, or to inform, practice may be particularly unlikely to be described in peer-reviewed publications or even formally documented in reports available to others in electronic or hard copy. Information on these activities may, instead, be stored in more private or informal spaces such as meeting notes, emails, or even just in people’s memories.” To me, this statement applies as much to librarianship as it does to public health. I can’t imagine how many awesome library programs and practices we could learn from, except for the fact that few of us have heard about them.

The second thing that struck me as I read this article was that even though the authors conceded their work was “verging” on primary research, they considered these projects to be evidence syntheses instead. But evidence synthesis relies on published information. Rather than asking for additional information to clarify the data they collected from a published source, the authors gathered new information by interviewing key informants, so to me, they were conducting primary research: full stop. The authors seemed to know what an evidence synthesis actually entails – not everyone can say the same – so I wonder: the work they did was a legitimate form of research, so why label it as evidence synthesis? Are the lines between different forms of research really that blurry? Were they trying to avoid going through the REB process? Or were they concerned their work wouldn’t have the status associated with an evidence synthesis, and so they named it to their liking?

I think that sometimes we don’t realize that library research has a lot in common with research in other fields. Like the field of public health, there is so much useful information about our practice that is not widely available or findable. I think we also have our go-to research methods, and opinions about what kinds of publications count… and what don’t. The “how we done it good” articles that simply describe a program or activity have gotten a bit of a bad rap in recent memory. I do agree with those who say that we need more rigorous, research-oriented library publications in general. But simply sharing what was done informs us of what is going on in library practice in a discoverable way. Perhaps we should not be so quick to discourage it.


Adams J et al. Searching and synthesising ‘grey literature’ and ‘grey information’ in public health: critical reflections on three case studies. Systematic Reviews. 2016;5(1):164.


In The End, It All Starts With Really Good Questions!

by Angie Gerrard, Student Learning Services, University of Saskatchewan

While attending an experiential learning showcase on my campus a few weeks ago, I was struck by a common theme mentioned by several faculty presenters. Faculty who work with students undertaking original research projects noted that a common challenge for students was identifying a research question. One faculty member surveyed her students on their experiences with the course research project, and students reported that articulating a research question was the most difficult part of the entire project. An interesting side note: students reported that analyzing their data was the most valuable part of the process.

The challenge of formulating research questions piqued my interest as a librarian, since we are often on the front lines assisting students with the evolution of their topic as the research process unfolds. We often help students navigate the iterative processes of exploring a topic, brainstorming potential avenues of research, asking different questions, undertaking initial searches in the literature, and narrowing (or broadening) the scope of a question, all the while tweaking the research question and trying to avoid the dreaded ‘maybe I should just switch my topic’. I often wonder if there is an understanding of the time commitment and perseverance required for these initial, complex processes in the research cycle. Clearly, students are struggling with this; the challenge was echoed in Project Information Literacy’s findings, where students were asked what was most difficult about research and 84% reported that getting started was the most challenging part (Project Information Literacy, n.d.).

We know that students struggle with these initial stages of the research process, so what can librarians and faculty do to help students get past the hurdle of formulating good research questions? Here are a few suggestions.

Be explicit about the process. Research is iterative, messy, and time-consuming, and students who are new to academic research often arrive with a more linear mental model of it. To illustrate that research is a process, it is powerful to show students how to take broad course-related research topics, break them down into potential research questions, discuss how the questions evolve once one gets a taste of the literature, and show how further refinement takes place as the process continues. When we are explicit about the process, students better understand that the broad topic they start with often evolves into something much more meaningful, unexpected, or interesting.

Encourage curiosity in the research process. At the campus event I alluded to, when I asked faculty how they dealt with students’ struggles with identifying research questions, they all reported the importance of students picking something that interests them, something they are curious about. Anne-Marie Deitering and Hannah Gascho Rempel (2017), librarians at Oregon State University, recognized the overwhelming lack of curiosity students in their study expressed when asked to reflect on their own research process. In response, the authors recommend “that as instruction librarians we needed to enter the process earlier, at the topic selection stage, and that we needed to think more intentionally about how to create an environment that encourages curiosity” (p. 3). In their awesome paper, the authors discuss different strategies they used with first-year students to encourage curiosity-driven research.

Start with a juicy source or artifact! Chat with faculty and ask them to recommend a subject-specific editorial, news article, blog posting, etc. that is controversial and/or thought provoking.  These sources can be old or new; the point is that students start with intriguing sources, not a pre-determined list of research topics. Students examine the sources then begin to develop various lines of inquiry, which evolve into research questions.

Use the Question Formulation Technique (QFT). Although this technique was developed for the K-12 environment, the approach can be adapted to higher education and beyond.  The QFT has six steps, as summarized in the Harvard Education Letter:

  • Step 1: Teachers Design a Question Focus. This question focus is a prompt in any form (visual, print, oral) that is meant to pique students’ interests and stimulate various questions.
  • Step 2: Students Produce Questions. Students note questions following a set of four rules: ask as many questions as you can; do not stop to discuss, judge, or answer any of the questions; write down every question exactly as it was stated; and change any statements into questions.
  • Step 3: Students Improve Their Questions. Students identify whether each of their questions is open- or closed-ended and flip the questions into the alternative form.
  • Step 4: Students Prioritize Their Questions. With the assistance of the teacher, students sort and identify their top questions. Students move from divergent thinking (brainstorming) to convergent thinking (categorizing and prioritizing).
  • Step 5: Students and Teachers Decide on Next Steps. This stage is context specific where students and teachers discuss how they are going to use the identified questions.
  • Step 6: Students Reflect on What They Have Learned. This final step allows students to develop their metacognitive / reflective thinking (Rothstein & Santana, 2011).

Rothstein and Santana (2011) note that “(w)hen students know how to ask their own questions, they take greater ownership of their learning, deepen comprehension, and make new connections and discoveries on their own. However, this skill is rarely, if ever, deliberately taught to students from kindergarten through high school. Typically, questions are seen as the province of teachers, who spend years figuring out how to craft questions and fine-tune them to stimulate students’ curiosity or engage them more effectively. We have found that teaching students to ask their own questions can accomplish these same goals while teaching a critical lifelong skill” (para. 3).

We know that formulating research questions can be a challenge for students. Being honest, explicit, and transparent about the process may help students tackle it. I think we could all agree that encouraging curiosity in research and asking meaningful questions are not confined to academia; they are characteristics of lifelong learners.

In the end, it all starts with really good questions!


Deitering, A.-M., & Rempel, H. G. (2017, February 22). Sparking curiosity – librarians’ role in encouraging exploration. In the Library with the Lead Pipe.

Project Information Literacy. (n.d.). Project Information Literacy: A national study about college students’ research habits [Infographic].

Rothstein, D., & Santana, L. (2011). Teaching students to ask their own questions. Harvard Education Letter, 27(5).


Impactful research

by Nicole Eva-Rice, Liaison Librarian for Management, Economics, Political Science, and Agriculture Studies, University of Lethbridge Library

Why do we do research? Is it simply to fulfill our obligations for tenure and promotion? Is it to satisfy our curiosity about some phenomenon? Or is it to help our fellow librarians (or researchers in another discipline) to do their jobs, or further the knowledge in our field?

I find myself grappling with these thoughts when embarking on a new research project. Sometimes it’s difficult to see the point of our research when we are stuck on the ‘publish or perish’ hamster wheel, and I suspect it’s all the more so for faculty outside of librarianship. It’s wonderful when we have an obvious course set out for us and can see the practical applications of our research – finding a cure for a disease, for example, or a way to improve school curriculum – but what if the nature of our research is more esoteric? Does the world need another article on the philosophy of librarianship, or the creative process in research methods? Or are these ‘make work’ projects for scholars who must research in order to survive in academe?

My most satisfying research experiences, and the ones I most appreciate from others, have to do with practical aspects of my job. I love research that can directly inform my day-to-day work, letting me know that any decisions I make based on it are grounded in evidence. If someone has researched the effectiveness of flipping a one-shot and can show me whether it’s better or worse than the alternative, I am very appreciative of their efforts both in performing the study and in publishing their results, as I can benefit directly from their experience. Likewise, if someone publishes an article on how they systematically analyzed their serials collections to make cuts, I can put their practices to use in my own library. I may not cite those articles – in fact, most people won’t unless they do further research along that line – but they have a direct impact on the field of librarianship. Unfortunately, that impact is invisible to the author/researchers, unless we make a point of contacting them and telling them how we were able to apply their research in our own institutions (and I don’t know about you, but I have never done that, nor had it occurred to me to do so until just this minute). So measuring ‘impact’ by citations, tweets, or downloads just doesn’t do justice to the true impact of an article. Even a philosophy of librarianship article could have serious ‘impact’ in how it shapes the way someone approaches their job – but unless the reader goes on to write another article citing it, the original article has nothing that proves the very real impact it has made.

In fact, the research doesn’t even have to result in a scholarly article – if I read a blog post on some of these topics, I might still be able to benefit from them and use the ideas in my own practice. Of course, this depends on exactly what the content is and how much rigor you need in replicating the procedure in your own institution, but sometimes I find blog posts more useful in my day-to-day practice than the actual scholarly articles. Even the philosophical-type posts are more easily digested and contemplated in the length and tone provided in a more informal publication.

This is all to say that I think the way we measure and value academic research is seriously flawed – a view many librarians (and other academics) share, though others in academia still cling to the system. This is becoming almost a moral issue for me. Why does everything have to be measurable? Why can’t STP committees take a research project as described at face value, and accept other types of impact it could have on readers, policy makers, or practitioners, rather than assigning a numerical value based on where it was published and how many times it was cited?

When I hear other faculty members discussing their research, even if I don’t know anything about their subject area, I can often tell if it will have ‘real’ impact or not. The health sciences researcher whose report to the government resulted in policy change obviously had a real impact – but she won’t have a peer-reviewed article to list on her CV (unless she goes out of her way to create one to satisfy the process), nor will she likely have citations (unless the aforementioned article is written). It also makes me think about my next idea for a research project, which is truly just something I’ve been curious about, but which I can’t see many practical implications for other than to serve others’ curiosity. It’s a departure for me because I am usually the most practical of people, and my research usually has to serve the dual purpose of having application in my current workplace as well as becoming fodder for another line on my CV. As I have been thinking about impact more and more, I realize that as publicly paid employees, perhaps we have an obligation to make our research have as wide a practical impact as possible. What do you think? Have we moved beyond the luxury of researching for research’s sake? As employees of public institutions, do we have a societal obligation to produce practical outcomes? I’m curious as to what others think and would love to continue the conversation.

For more on impact and what can count as evidence of it, please see Farah Friesen’s previous posts on this blog, What “counts” as evidence of impact? Part 1 and Part 2.


(Small) public libraries do research too!

By Meghan O’Leary, MLIS, Collections and Reader’s Advisory Librarian, John M. Cuelenaere Public Library

Last October I attended the Centre for Evidence Based Library and Information Practice Fall Symposium and quickly realized that I was the only public librarian in attendance; the year before, there were only two of us. Almost all the presentations were geared towards special or academic libraries, which got me thinking, “Hey! Public librarians do this kind of research too!”

Of course, public libraries do research! Admittedly, research in the LIS discipline is dominated by academic librarians. Even research about public libraries tends to be done mostly by academic librarians. Why is that? Public librarians do not need to publish in the same way that academic librarians do, but why don’t we publish more research? Do we not have the time or funding? Do we not consider what we do to be research worth publishing? These are important questions, but not what I want to discuss today.

What I do want to talk about is what small public libraries, specifically the one I work at, do as far as research is concerned. But first, some background information. I live in Prince Albert, Saskatchewan and work as the Collections and Reader’s Advisory Librarian at John M. Cuelenaere Public Library. The library has one full branch and one satellite branch on the west side of the city, and Prince Albert has a population of roughly 40,000 people. Compared to Saskatoon, Regina, Edmonton, Calgary, etc., we are a rather small library.

Small public libraries, like mine, do engage in research. However, the research we do is generally not seen as “traditional” research, because data collection is usually an ongoing process and we often do not share it with the LIS community. Matthews (2013) offers a model of “Try, Assess, and Reflect” for public libraries embracing evidence-based librarianship: “try something, gather some data about the effectiveness of the change, and then make some adjustments” (p. 28). Here’s an example of how we used this model. A couple of years ago we looked at what other libraries were doing and made the decision to launch a small video game collection. After a few months, I gathered statistical information about the new collection, and based on that, we tweaked how we were doing things. Some of the items were not being returned, so we limited checkouts to two games per patron. E-rated games were being used more than M-rated games, so I altered my buying habits accordingly. Each month I gather statistical data on the whole collection to see what is being used, what is not being used, and what the current trends are.
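The numbers behind that kind of decision don’t require anything fancy. As an illustrative sketch (the titles and counts below are invented, not our actual circulation data), tallying checkouts by rating might look like:

```python
from collections import Counter

# Hypothetical checkout records for a video game collection:
# one (title, ESRB rating) pair per checkout.
checkouts = [
    ("Kart Racer", "E"), ("Kart Racer", "E"), ("Puzzle Quest", "E"),
    ("Puzzle Quest", "E"), ("Dark Siege", "M"), ("Farm Friends", "E"),
]

def checkouts_by_rating(records):
    """Tally checkouts by rating to guide buying decisions."""
    return Counter(rating for _, rating in records)

stats = checkouts_by_rating(checkouts)
# If E-rated games circulate far more than M-rated ones,
# shift the budget toward E-rated titles.
print(stats)
```

Run monthly against real circulation records, a tally like this is all the “Assess” step of the Try, Assess, and Reflect model needs to be.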

That is an example of how small public libraries use quantitative research methods to guide change; however, there has been a shift in the LIS community from quantitative to qualitative methodologies. Another project I want to talk about is our most recent strategic planning project. It has been ongoing for a few months now, and we have done various types of information gathering. We use statistical data like gate counts, usage stats, and website metrics to guide us in creating a new strategic plan, but we also held three separate strategic planning sessions where we gathered qualitative data. The first session was with the members of our board and library management, the second with the rest of the library staff, and the third with the public. The major topics up for discussion were Facilities, Technology, Collections, Programs, and Community Outreach. The topics were written on large pieces of paper posted around the room; then everyone who attended was given a marker (and a cookie, because you have to lure them in somehow) and asked to go around the room and write their ideas under each heading. Each session built on the previous one. We analyzed the information gathered and have started developing a work plan targeting each of the major points. That information has already helped us with the designs for our renovation project, as well as with our budget allocations.

I could write more about the various types of research small public libraries, such as John M. Cuelenaere Public Library, do but I do not want to turn this blog post into an essay! If there are any Brain-Works blog readers out there who are also from public libraries and conduct other forms of research please comment! I would love to hear what other public libraries (large or small) are doing.


Matthews, J. R. (2013). Research-based planning for public libraries: Increasing relevance in the digital age. Santa Barbara, CA: Libraries Unlimited.


The first few weeks of sabbatical – Time to focus!

by Laura Newton Miller, on sabbatical from Carleton University

I’m lucky to be in the beginning weeks of a one-year sabbatical. This is my second sabbatical, and I seem to be approaching this one a little differently than I did seven years ago.

Unlike my first sabbatical, I started this one with a once-in-a-lifetime family trip to New Zealand. Although it was mostly a holiday, I did have the opportunity to meet and discuss all-things-library with Janet Fletcher and some of the lovely staff at Victoria University of Wellington. I love seeing how other libraries do things, and our discussions really helped me focus my research. I was also able to discuss my research focus with family in Wellington, and I really appreciated being able to apply their non-library perspective to my own work.

Knowing that I was taking this holiday, I did a lot of initial research work pre-sabbatical (ethics approval, survey implementation) so that when I returned, I’d be able to immediately sink my teeth into the analysis. This is different from my first sabbatical, where I started work right away.

So what have I learned so far in my second sabbatical? I will readily admit that I probably have more questions than answers at this point, but I do have some tidbits of what to watch out for….

Limit social media

  • I know, I know – we know this – but it’s tricky sometimes! I find it very easy to do while on vacation, but the combination of jet lag and arriving back just in time for a lot of turmoil south of the Canadian border made it very difficult to focus my first week back. I’m finding staying off social media a little more difficult this time around, but I’m aiming to check less often.

Find the time to work

  • I have school-age kids. I’m not sure whether the winter weather was better during my last sabbatical, but my kids seem to be around more this time because of storm cancellations or catching some sickness or bug. That makes it difficult to work during more “traditional” hours. I’m happy to be there for them, but finding quiet time can sometimes be a challenge.

I still love analysis

  • I’m reading through comments from my survey. It was overwhelming at first, just sort of “swimming” in all the data, trying to figure out the themes and ways to code things. I’ve finally reached a breakthrough, which is exciting in itself, but even when I’m floundering I still just love it. I’m so excited for all of the things I’m going to learn this year.

I MAY have taken on too much.

Take a vacation/significant break before sinking teeth into work

  • Since I’m really at the beginning of everything right now, I’m still on the fence about whether or not this has helped my productivity. But it has been wonderful to give myself space between my work life and my sabbatical life, to have a chance to “let go” of some of the work-related things and to really focus. Which leads me to….

Stay off work emails

  • I found this very easy to do during my first sabbatical. Because I’m at a different point in my career now, I find myself checking my email *sometimes* this time around. But I try to limit it to occasionally clearing out junk mail and catching up on major work-related news.

Do you have any tips on staying focused? I would love to hear them. I’m excited and energized about what my sabbatical year holds!


Leaning In or Leaning Back?

by Marjorie Mitchell
Research Librarian, UBC Okanagan Library

With apologies to Sheryl Sandberg.

I am going to admit it here first – I’m going through a bit of a dry spell with my research. Actually, it’s not the research – that feels like it’s making some forward movement for the time being. No, the dry spell I’m experiencing has more to do with disseminating my research results and getting my findings “out there” than with the “research” per se. I’ve recently had proposals for two conference presentations (one traditional presentation and one poster) rejected. Now I’m in a bit of a quandary, trying to read the message of these two rejections to determine whether I should continue this line of research. Do I continue and hope the results of my research will be more convincing and compelling nearer completion (leaning in), or is it a good time to adjust the focus of my research (leaning back)?

Research is a funny thing. While many of us conduct research for reasons like “contributing to the profession” or “out of curiosity” or “because I’m required to do research for tenure and/or promotion”, few of us spend enough time determining whether our topic meets the criteria Hollister (2013) called “noteworthy”. Basically, he is pointing out the importance of saying something new, or utilizing something existing, either a theory or method, in a new and unique way.

I would take this a step further and say that a topic also needs to be timely. If your topic has already been written about and presented on many times, it might be that the topic has become stale, even if you have found something new to add to the knowledge about the topic. Another contribution to a topic that has occupied our professional attention for some time just isn’t as appealing as something newer. There is also the problem of being too new. There are some topics and ideas that are just a bit too far ahead of the crowd and won’t be accepted in the current round of conferences and upcoming journals.

Some ideas are just ahead of what the profession is ready to be discussing at any given time. No matter how well composed, researched and executed, an idea that is ahead of its time will fall on deaf ears. You may have had the experience of coming up with a topic and pitching it, only to see it presented by someone else two years later at your favorite conference. There is no quick or easy solution to this. You can only console yourself with a hot cup of tea, secure in the knowledge that you had that idea first.

I think one solution to the issue of being timely is also to develop a certain passionate detachment to the research you’re doing. Research needs a certain amount of objectivity, but I truly believe research needs passion and enthusiasm to carry it forward. I’ve come to recognize, however, I also require a certain amount of detachment, particularly at the conclusion of my research, to allow me to withstand the rejections my ideas sometimes receive.

Sometimes it is worthwhile to step back from the research, particularly after a rejection, and honestly weigh whether the research is still worth pursuing and finishing. It may be your great idea is just a little too late. For now, I’m going to take my proposals to a colleague and get a second, less biased, look at them before I make any decisions. So, before I lean in any direction, I’m going to lean on a friend for advice. I don’t think Sheryl mentioned that kind of leaning.


Hollister, C. V. (2013). Handbook of academic writing for librarians. Chicago: Association of College & Research Libraries.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Locating the Local: A Literature Review and Analysis of Local Music Collections

by Veronica Kmiech, BMUHON, College of Education, University of Saskatchewan

This work is part of a larger research project titled “Local Music Collections” led by Music Librarian Carolyn Doi and funded by the University of Saskatchewan President’s SSHRC research fund. A post from Carolyn’s perspective on managing this project will be published in 2017 on the C-EBLIP Blog.


Research is to see what everybody else has seen, and to think what nobody else has thought.
Albert Szent-Gyorgyi1

At this point in my university career, I have written several research papers, most of which were for the musicology courses I took as part of my music degree. This research gave me familiarity with the library catalogue, online databases for musicological articles, interlibrary loan, and contacting European collections to request material (this last one involved an interesting 4 a.m. phone call). As a Research Assistant, my background was helpful, but I found the depth of searching needed for the literature review much greater than anything I had done before.

My role as a research assistant for music librarian Carolyn Doi involved searching for sources, screening those sources based on their relevance to the project, and using NVivo software to identify themes in the literature.


The aim of the literature review was to find sources that discuss local music collections, especially those found in libraries. With these results, a survey to gather information on current practices for managing local music collections is under development.

It was important to find as many sources as possible, across a wide range of geographic areas and collection types, although the majority came from North America. Reading sources from around the world that discuss collections in a range of settings (e.g. libraries, churches, privately built collections, etc.) increased my understanding of the contexts that exist for local music collections.

One of the most important parts of the literature review was to find as many items relating to local music collections as possible, or in other words – FIND ALL THE SOURCES!

Thirteen sources became a jumping-off point, providing guidelines for how to focus the literature review. From there, I searched for literature in a variety of locations including USearch, Google Scholar, Library and Information Studies (LIS) databases, music databases, education databases, newspaper databases, humanities databases, and a database for dissertations and theses.

As a music student, I was familiar with the library catalogue and databases such as JSTOR. However, I was not familiar with the LIS or the Education databases. There were a variety of articles from journals, books, and newspapers that described different types and aspects of local music collections. One point of interest was the range of collection types, which appear in academic libraries and public libraries, to private and government archives. Most of the sources were case studies, which discussed the challenges and successes of a particular collection.

Other sources of information were print works from the University of Saskatchewan Library and Interlibrary Loan, conference abstracts and listserv conversations from the International Association of Music Libraries, Archives and Documentation Centres (IAML), the Canadian Association of Music Libraries, Archives and Documentation Centres (CAML), and the Music Library Association (MLA).

After completing the search, 408 unique results were saved. Although many of the same sources appeared in different search locations, Figure 1 shows where documents were first located.

The majority of the sources came from North America and Europe. It is worth noting that this may be a result of the databases searched, rather than an indication of absence of local music collections and the study of such in other parts of the globe.

Figure 1: Pie chart showing all 408 saved documents based on search location.3

Challenges & Limitations

The common challenge, regardless of the database being searched, was identifying effective search terms for locating relevant sources. When searching in places like Google Scholar, JSTOR, and USearch, it was important to narrow the parameters considerably; otherwise one would obtain thousands of hits. Full-text searches, for example, were not helpful. Comparatively, some of the LIS databases and ERIC, an education database, required only a keyword or two to find all of the information relevant to local music collections that they contained.


Figure 2: Geographical distribution of sources in the literature review.5

I saved 408 sources to Mendeley. These consisted primarily of journal articles describing case studies from a variety of international locations. Three hundred and sixty of the sources came from North America and Europe, with the complete breakdown by continent shown in Figure 2. Since we were more interested in research from North America, it is worth noting that 123 of the 201 North American sources are from the United States, 73 are Canadian, and 5 are from other countries such as Jamaica.
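The North American breakdown above can be tallied in a few lines of Python as a quick sanity check; the counts are the ones reported in the text, and the percentage formatting is simply one illustrative way to present them.

```python
# Source counts for North America, as reported in the literature review.
north_america = {"United States": 123, "Canada": 73, "Other (e.g. Jamaica)": 5}

total_na = sum(north_america.values())
total_sources = 408

assert total_na == 201  # matches the 201 North American sources cited

for country, count in north_america.items():
    print(f"{country}: {count} ({count / total_na:.0%} of North American sources)")
print(f"North America: {total_na} of {total_sources} saved sources "
      f"({total_na / total_sources:.0%})")
```

Running this confirms the figures are internally consistent: the three country counts sum to 201, just under half of the 408 saved sources.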

After screening, 59 documents were selected for NVivo content analysis. Documents were included if they spoke directly to the management of local music collections in public institutions. Documents were excluded if they were less relevant to the research topic (for instance, descriptions of private collections) or if they served only as useful context (for example, a resource on developing sound collections in a library).


For me, completing this literature review was a little bit like a treasure hunt – what could I do to find more information? Where else can I look? This process took me to locations for research that I did not even know existed, like the IAML listserv. And, after accidentally emailing every music librarian on the planet while trying to figure out how to work the thing, I was able to add a new researching tool to my repertoire.

In conclusion, the literature review served as a means of finding sources to analyze. However, it provided more than just a list of articles. The completed literature review, although global in scope, created a picture centered on North America, which has been an enormous help in understanding the research topic. Through this search for documents, it has also been possible to see how best to approach the analysis, based on what work has already been accomplished and what work still needs to be done in this field.

1Szent-Gyorgyi, Albert. BrainyQuote. “Albert Szent-Gyorgyi Quotes.” Accessed July 22, 2016.
2Imgflip. “Meme Generator.” Accessed May 30, 2016.
3Meta-chart. “Create a Pie Chart.” Accessed September 17, 2016.
4“Meme Generator.”
5“Create a Pie Chart.”

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Building a positive culture around practitioner research, one symposium at a time

by Christine Neilson
Neil John MacLean Health Sciences Library
Centre for Healthcare Innovation
University of Manitoba

This fall I attended my first C-EBLIP symposium, and it was fantastic. The day was filled with interesting presentations; I had a chance to see old colleagues and meet new people who share an interest in library research; and they gave me bacon for breakfast, which is always a win as far as I’m concerned. Two recurring themes during the day were 1) leading by example, and 2) the personal aspects of doing research (such as dealing with research projects that go off the rails, professional vulnerability, and the dreaded “imposter syndrome”). Both of these themes are important: the first as a call to action, the second as an acknowledgement that research isn’t necessarily easy, but that none of us are truly alone and there are things we can do to cope.

Acknowledging and exploring the personal issues that come with conducting research is not something that we tend to talk about. I might tell a trusted colleague that sometimes I’m afraid others will see me as the researcher equivalent of the Allstate DIY-er – all of the enthusiasm and optimism, but none of the skill or ability – but generally, we limit our “official” professional discussion to less sensitive topics. Maybe that’s because we don’t want to admit that there might be any issues. Or maybe it’s because there’s a risk the discussion could degenerate into a pity-party that doesn’t move anyone or anything forward. Either way, I think that this is a topic area that needs to be explored in a constructive way.

The C-EBLIP Symposium was a venue that genuinely felt safe to talk about research and the experience of doing research, and I’m thankful I was able to attend. I’m particularly happy that this year’s presenters will have an opportunity to publish about their presentations in an upcoming issue of Evidence Based Library and Information Practice journal. It’s a great opportunity for presenters to share their research, ideas, and experiences with a wider audience, and it will help ensure that content from the day doesn’t disappear into the ether. Building a culture with certain desired qualities is extremely difficult. I’m encouraged that C-EBLIP is building a positive, supportive culture of practitioner research in librarianship and I hope the momentum continues!

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The “Why?” behind the research

by Andrea Miller-Nesbitt
Liaison Librarian, Schulich Library of Physical Sciences, Life Sciences and Engineering, McGill University

Lorie Kloda
Associate University Librarian, Planning & Community Relations, Concordia University

Megan Fitzgibbons
Innovation Librarian, Centre for Education Futures, University of Western Australia

When reading journal articles reporting original research, the content usually follows the IMRAD format: introduction, methods, results, and discussion. Word count limits, author guidelines, and other conventions mean that the researchers’ motivation for conducting the study is often left out. In this post we present our motivations for conducting a research study on librarians’ participation in journal clubs:

Fitzgibbons, M., Kloda, L., & Miller-Nesbitt, A. (pre-print). Exploring the value of academic librarians’ participation in journal clubs. College & Research Libraries.

Being an evidence-based practitioner can sometimes involve a bit of navel-gazing. Beyond using evidence in our professional work (e.g., for decision-making, evaluating initiatives, etc.), we may likewise ask questions about the outcomes of our own professional development choices.

After three years of facilitating the McGill Library Journal Club, we began to think about ways we could disseminate our experience and lessons learned, and most importantly, how we could determine librarians’ perceived outcomes of participating in a journal club. We felt anecdotally that participating in a journal club is worthwhile, but we wondered: Can we formally investigate the impacts of participation on librarians’ practice and knowledge? What evidence can we find to inform the decisions and approaches of librarians and administrators in supporting or managing a journal club? Is there a connection between journal clubs and evidence-based librarianship? We also wanted to learn more about approaches taken in a variety of journal clubs and how they define success for their group.

The McGill Library Journal Club was initially established in order to help foster evidence-based practice by reflecting on the library and information studies literature and using those reflections to inform practice. The journal club also provides a professional development opportunity for all those interested. Although the McGill Library Journal Club has experienced many of the same challenges as other journal clubs, it is still going strong after 6 years thanks to a core group of motivated facilitators. (For more information about the journal club’s activities, see the McGill Library Journal Club wiki.)

In order to answer these questions, we first had to agree on a definition of a journal club. After some reading and deliberation, we framed participation in a journal club as an informal learning activity: learning that occurs outside classrooms or training sessions, but still involves some coordination and structure. In this context, our research question was: “What do librarians perceive as the value of participating in a journal club?” We focused on academic librarians who participate in journal clubs to manage the scope of the study, but a similar approach could be taken in other library and information organizations as well.

Because we were interested in gaining insight into individuals’ experiences, we considered several methods, and ultimately selected an in-depth qualitative method, the hermeneutic dialectic process (Guba & Lincoln, 1989). This is a method that we have seen used in the social sciences for the purpose of evaluation and reconciling diverse perspectives. At the time we were coming up with our research question, one of the authors (Lorie) was an assessment librarian interested in qualitative methods. She brought Guba and Lincoln’s writing to the team for discussion. It seemed both appropriate for answering our research question and flexible enough to let us really capture study participants’ experiences, not just what we expected to hear. We believe that this is the first use of this method in LIS research, so an additional motivation for the study was to apply the approach in the field.

As per the method, we conducted semi-structured in-depth interviews with each participant. After the first interview, central themes, concepts, ideas, values, concerns and issues that arose in the discussion were written into an initial “construction” which captured the experiences and perceptions expressed by the interviewee. Then in the second interview, the participant was asked to react to some of the points brought up by the first interviewee, as expressed in the construction. The construction was added to after each interview, incorporating the perspectives of each successive interviewee and used to inform the subsequent interviews. At the end, all participants were given the opportunity to comment on the final construction and let us know whether their perspectives were accurately represented.

Ultimately, we believe that the findings of our published study are of interest to librarians who aim to create and sustain a journal club. In particular, it could offer insight as they form goals for their group and justify the activity, including seeking support and recognition for their group.

More details about the impacts of academic librarians’ participation in journal clubs are of course presented in the article. In addition, in collaboration with the C-EBLIP Research Network, we hope to compile additional resources about journal club practices in librarianship and open communication channels in the future. Watch this space, and please get in touch if you have any ideas about promoting journal clubs for academic librarians.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.