Digital humanities and the library: Where do we go from here? C-EBLIP Journal Club, October 6, 2016

by Carolyn Doi
Education and Music Library, University of Saskatchewan

Article: Zhang, Ying, Shu Liu, and Emilee Mathews. 2015. “Convergence of Digital Humanities and Digital Libraries.” Library Management 36 (4): 362-377. http://search.proquest.com/docview/1684384505/abstract/D65EF9A05834DADPQ/2

In October, the C-EBLIP Journal Club met to discuss an article focused on the evolving domain of digital humanities and its role within the academic library. The article in question, “Convergence of Digital Humanities and Digital Libraries,” was published by Zhang, Liu, and Mathews in Library Management, a journal that aims to “provide international perspectives on library management issues… highly recommended for library managers.”1 The article discussed ways that libraries might support scholarship in digital humanities (DH), digging into aspects of content, technology, and services that the library might develop for digital humanities scholars. I was compelled to select an article on this subject, as I recently attended a web broadcast of the “Collections as Data” livestream, where DH and librarianship were discussed together several times,2 leading me to consider my own background in musicology and librarianship and how the two might overlap through a digital humanities lens.

The members of the journal club chose to assess the article from a few different angles: context, audience, methodology, findings, and conclusions. Our discussion of the article was aided by the EBL Critical Appraisal Checklist.3 Developed by Lindsay Glynn, this tool is made up of a series of questions that guide the reader through assessment of a study, including its population, data collection, study design, and results.4 We found that using the checklist allowed us to think critically about each aspect of the study design and to assess the article’s reliability, validity, and usability within our own professional context. A summary of our discussion is presented below.

Context & Audience

During our conversation, we noted that this article is aimed at library managers, or those who may be in an administrative role looking to gain a quick picture of the role of libraries in interacting with digital humanities scholars. It was noted that the link between libraries and digital humanities has already appeared in the literature on many occasions, and that to get a fuller picture of how libraries might approach this collaborative work, reading other critical opinions will be of utmost importance. One may want to consult the list of resources provided by the dh+lib folks, which can be found on their website, to get a sense of some of the core literature.5

Methods

The methods section of this article describes how the researchers consulted various evidence sources to identify current challenges and opportunities for collaboration between DH and libraries. The authors state that they combined findings from a literature review with virtual and physical site visits to “humanities schools, research centers, and academic libraries.” The databases searched were named, though the search terms were not. We felt that including this information would be helpful both for assessing the quality of the search and for other researchers hoping to replicate or build on the review. The search resulted in 69 articles, 193 websites, and 2 physical sites. While discussing the validity of these evidence sources, we felt that although the literature and online site visits may provide a reasonably representative selection of sources to draw conclusions from, the sample of two physical sites was too small to support generalizable conclusions.

Findings

Zhang, Liu, and Mathews’ findings include both challenges and opportunities for collaboration between the DH and digital library communities. How the evidence was weighed or analysed to arrive at these results was not clearly outlined in the paper, and we felt that including such information would help the reader evaluate the usefulness and reliability of the findings. A summary of these findings is provided in the accompanying chart.

Challenges

• “DH is not necessarily accepted as qualifying scholarship… novel methodologies and the theoretical assumptions behind their work have been questioned by their peers from traditional humanities schools of thought.”
• “The DH community has unbalanced geographical and disciplinary distributions… Related DH collections are not yet integrated. These digital collections are distributed in different schools, academic units, museums, archives, and libraries. Few efforts have been made to link related resources together.”
• “The technologies used in DH create barriers for new scholars to learn and for projects to be sustainable”

Opportunities

• Creating “knowledge through new methods”
• Working “across disciplines [that] are highly collaborative”
• Producing a “unit of currency…[that] is not necessarily an article or a book, but rather, a project…usually published using an open web platform, allowing users to dynamically interact with underlying data”
• Establishing “major scholarly communication, professionalization, and educational channels”

Conclusions

In the conclusion of the article, Zhang, Liu, and Mathews present a positive perspective on the opportunities for collaboration between the DH and library communities: “To make collaborative work more successful, we, LIS professionals, need to challenge ourselves to continuously grow new skill sets on top of existing expertise and becoming hybrid professionals. The DL community should strive to make ourselves more visible, valuable, and approachable to the DH community. Even better, the DL community need to become part of the DH community.”

On this point, the journal club’s conversation focussed on the capacity of libraries to take on these new collaborations, and whether we are necessarily prepared for such projects. These thoughts are echoed by Posner, who writes in her article, “No Half Measures: Overcoming Common Challenges to Doing Digital Humanities in the Library” that “DH is possible in a library setting…but that DH is not, and cannot be, business as usual for a library. To succeed at digital humanities, a library must do a great deal more than add ‘digital scholarship’ to an individual librarian’s long string of subject specialties.”6

The domain of DH is compelling and creative: it incorporates new methods, produces innovative means of dissemination, and combines diverse perspectives on research. Libraries are well positioned to contribute to this domain, though exactly how this should or can happen has no one-size-fits-all answer. Zhang, Liu, and Mathews present some good points that may serve to begin a conversation on how libraries and DH folks might work together. Each of these points warrants further investigation by the librarian or administrator aiming to implement these strategies in their own institution.

1“Library Management.” https://ulrichsweb.serialssolutions.com/title/1478296246359/117078

2Library of Congress. “Collections as Data: Stewardship and Use Models to Enhance Access” September 27, 2016. Accessed November 4, 2016: http://digitalpreservation.gov/meetings/dcs16.html

3EBL Critical Appraisal Checklist. http://ebltoolkit.pbworks.co/f/EBLCriticalAppraisalChecklist.pdf

4Glynn, Lindsay. “A critical appraisal tool for library and information research”, Library Hi Tech 24, no. 3 (2006): 387 – 399. http://dx.doi.org/10.1108/07378830610692154

5“Readings” dh+lib. Website. Accessed November 4, 2016. http://acrl.ala.org/dh/dh101/readings/

6Posner, Miriam. “No Half Measures: Overcoming Common Challenges to Doing Digital Humanities in the Library.” Journal of Library Administration 53, (2013): 43-52. http://dx.doi.org/10.1080/01930826.2013.756694

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

More Data Please! Research Methods, Libraries, and Geospatial Data Catalogs: C-EBLIP Journal Club, August 25, 2016

by Kristin Bogdan
Engineering and GIS Librarian
University Library, University of Saskatchewan

Article: Kollen, C., Dietz, C., Suh, J., & Lee, A. (2013). Geospatial Data Catalogs: Approaches by Academic Libraries. Journal of Map & Geography Libraries, 9(3), 276-295.

I was excited to have the opportunity to kick off the C-EBLIP Journal Club after a brief summer hiatus with a topic that is close to my heart – geospatial data! This article was great in the context of C-EBLIP Journal Club because it introduced the basics of geospatial data catalogs and the services around them, and provided an opportunity to look at the methods used by the authors as part of an ALA Map and Geospatial Information Round Table (MAGIRT) subcommittee research project.

Most of the group was unfamiliar with geospatial data catalogs, so the introductory material provided a good base for further discussion. There was good material about the breadth of the different metadata standards involved and how they are applied at the different levels of data detail. There was also good discussion about the importance of collaboration and the OpenGeoportal consortium in developing geospatial data catalogs.

One of the key themes of our discussion was that we would have liked to see more information about the research design and more data. We would have liked to see mention of the ethics process that the authors went through before carrying out their study. Our group had questions about the process that the subcommittee used to choose their sample, as it seemed like it was fairly limited. The authors acknowledge that this was not meant “to create a complete inventory” (p.281), but it seemed like it could have been broader to be more representative. We would also have liked to see the questions that were asked during the interviews and more of the qualitative data from the interviews themselves. It was unclear how structured the conversations with the catalog managers were and how the data presented in the tables and the conclusions were derived. The information presented in the tables was not consistently organized and seemed like it would have been more useful in the context of the interview. The pie chart they used on page 283 to show the “Approaches to Developing Geospatial Data Catalogs” was not as useful as a table of the same information would have been, as there are 5 pie sections to represent 11 data points.
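The group’s objection to the pie chart can be made concrete: with only 11 data points spread across 5 categories, a plain frequency table shows the exact counts that pie slices force the reader to estimate. A minimal sketch in Python, using invented counts and category labels (the article’s actual figures are not reproduced here):

```python
from collections import Counter

# Hypothetical answers from 11 institutions about how they built their
# geospatial data catalog; the category labels are invented for illustration.
responses = [
    "in-house", "in-house", "in-house", "in-house",
    "consortium", "consortium", "consortium",
    "open source", "open source",
    "vendor", "hybrid",
]

counts = Counter(responses)
total = sum(counts.values())

# A plain table: exact counts and shares, readable at a glance.
for approach, n in counts.most_common():
    print(f"{approach:12s} {n:2d}  ({n / total:.0%})")
```

With so few observations, a table like this carries the full dataset; a five-slice pie chart of the same numbers conveys strictly less information.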

In light of the questions around the data collection, the leap from the tables of responses to the recommendations seemed fairly large. In general, the lists of questions to consider when determining how to implement a geospatial data catalog were helpful, but they aren’t really recommendations. The cases that the authors present provide some ideas about the staffing and skills required to create a geospatial catalog, but they are vague. The first case seemed unnecessary, as it states “The library has determined that there is a clear need to provide access to the library’s spatial data and other spatial data needed by the library’s customers. However, the library does not have the technology, staffing, or funding needed to develop a spatial data catalog.” It would have been nice to see some alternative solutions for those without the ability to create a full-blown data catalog: for example, suggestions for practices that could start building its foundation, such as specific cataloguing practices or file-type considerations.

Our discussion concluded with reflection on how carefully and critically we read articles in our general research lives. One of the great things about Journal Club is that we have the opportunity to really interrogate and dissect what we are reading. The ensuing discussion is an opportunity to see the article from many different perspectives. This makes us better researchers in two ways: we are trained to more thoroughly evaluate the things we read and we take that into consideration in the research that we produce.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Problem with the Present: C-EBLIP Journal Club, June 21, 2016

by Stevie Horn
University Archives and Special Collections, University of Saskatchewan

Article: Dupont, Christian & Elizabeth Yakel. “’What’s So Special about Special Collections?’ Or, Assessing the Value Special Collections Bring to Academic Libraries.” Evidence Based Library and Information Practice [Online], 8.2(2013): 9-21. https://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/19615/15221

I was pleased to have the opportunity to lead the last C-EBLIP Journal Club session of the season. I chose an article which looked at the difficulties in employing performance measures to assess the value of a special collection or archives to the academic library. The article has some failings in that it is written largely from a business perspective, and it uses the terms “special collections” and “archives” interchangeably in a way that becomes problematic if you consider the archives’ responsibility as a repository for institutional records (which go through many different phases of use). However, it did serve as a useful springboard for our talk.

What interested me was that those present immediately latched on to the problem of “What about preservation value?” when considering the article’s model of measuring performance. The article proposes that the best way to measure a special collection/archives’ “return on investment” is not simply by counting the number of times an item is used (a collection-based method), but rather by reporting the number of hours a user spends working with an item and what the learning outcomes of that use are determined to be (a user-based method) (Dupont and Yakel, 11).

In some ways, a user-centric approach to measuring performance in archives and special collections makes good sense. A single researcher may spend five weeks exploring fifteen boxes, or taking a close look at a single manuscript, and so recording the user hours spent may prove a more accurate measure of use. To reinforce this, there are a number of difficulties in utilizing collection-based metrics with manuscript collections. Individual documents studied within an archival collection are almost impossible to track. Generally a file is treated as an “item”, and the number of files in a box might be averaged. The article points out, accurately, that this imprecision renders collection-based tabulation of archival documents, images, and ephemera virtually “meaningless” (Dupont and Yakel, 14).

However, if the end goal is determining “return on investment”, user-centric data also leaves out a large piece of the picture. This piece is the previously mentioned “preservation value”, or the innate value in safeguarding unique historical documents. Both collection-based and user-based metrics record current usage in order to determine the value of a collection at the present time. This in-the-present approach becomes problematic when applied to a special collections or archives, however, for the simple reason that these bodies not only preserve the past for study in the present, but also for study in the distant future.

To pull apart this problem of using present-based metrics to measure the worth of a future-purposed unit of the academic library, consider the recent surge in scholarship surrounding Aboriginal histories. As Truth and Reconciliation surfaces in the public consciousness, materials which may have been ignored for decades within archival/special collections are now in high demand. Questions of this nature accounted for approximately forty percent of our usage in the last month alone. Had collection-centric or user-centric metrics been applied during those decades of non-use, these materials would have appeared to be of little worth, and the special collections/archives’ “return on investment” might also have been brought into question. The persistence of archives and special collections in preserving unique historic materials regardless of patterns of use means that these materials can play a role in changing perspectives and changing lives nationwide.

If, as Albie Sachs says in his 2006 article on “Archives, Truth, and Reconciliation”, archives and special collections preserve history “for the unborn . . . not, as we used to think, to guard certainty [but] to protect uncertainty because who knows how the future might use those documents”, might not the employment of only present-centric metrics do more damage than good? (Sachs, 14). And, if the value of an archives or special collections cannot be judged solely in the present, but must take an unknown and unknowable future into account, perhaps the formulation of a truly comprehensive measure of “return on investment” in this field is impossible.

Sources:
Dupont, Christian & Elizabeth Yakel. “’What’s So Special about Special Collections?’ Or, Assessing the Value Special Collections Bring to Academic Libraries.” Evidence Based Library and Information Practice [Online], 8.2(2013): 9-21. https://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/19615/15221

Sachs, Albie. “Archives, Truth, and Reconciliation.” Archivaria 62 (2006): 1–14.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Many Benefits of OA and Open Peer Review: C-EBLIP Journal Club, May 10, 2016

by DeDe Dawson
Science Library, University of Saskatchewan

Article:
Tennant JP, Waldner F, Jacques DC et al. The academic, economic and societal impacts of Open Access: an evidence-based review [version 1; referees: 4 approved, 1 approved with reservations]. F1000Research 2016, 5:632 (doi: 10.12688/f1000research.8460.1)

This is arguably the perfect journal club article! A juicy topic with a few points of contention, a journal platform with many innovative features, and open post-publication peer review. Lots and lots to discuss, and indeed we ran out of time. Here I try to summarize our conversation:

I always gravitate towards review articles and highly recommend them to students. They are often the most efficient way to get up to speed on all the relevant literature in a complex area. Open Access (OA) is just such a complex area with multiple overlapping layers of issues, and all progressing so rapidly, that it is the ideal topic for a review. It is ironic because OA itself is such a simple concept. The complexity comes from the challenges of implementation and the multiple stakeholders (and vested interests) involved.

This review article summarizes the main evidence to date on the impact of OA from three perspectives: academic, economic, and societal. These are essentially three lines of reasoning in support of OA. We thought that the strongest, most well-developed argument in favour of OA in this article was the academic one. It certainly had the most citations behind it, thanks to the highly productive research area documenting the OA citation effect. We also suspected that the authors focused on the academic perspective because it was the most likely to persuade researchers who might be reading this review.

So, was the point of the article to persuade researchers to support OA? The authors have an obvious bias as proponents of OA. But how important is it for authors to be neutral? We thought it was unrealistic to expect authors not to have a bias, and for most kinds of papers authors indeed argue a particular point. But should review papers be different? It was suggested that if authors are clear and upfront about their objectives and competing interests this shouldn’t be a problem.

This brought us to the question: what evidence against OA might there be anyway? (We expose our own pro-OA bias here!) One of the online commenters on the article challenged the authors to provide a more balanced review – but he could not provide the authors with links to literature supporting these other anti-OA perspectives. Some of the obvious counter-arguments were already dealt with in the article, such as the rise of deceptive (“predatory”) publishing and the challenges of paying article processing charges (APCs) for authors without funding or those from underdeveloped countries. Otherwise, it is pretty hard to argue against OA unless you are a commercial publisher (or shareholder) with a financial interest in sustaining the current system. The commenter argued that jobs will be lost in the transition, but this is a weak point. Are we to prop up an entire dysfunctional, and inequitable, system for the sake of some jobs? Besides, these jobs will likely morph into other, more relevant and useful functions. What seemed to emerge from this back-and-forth was that “sustainable” means something completely different to commercial publishers (and their allies) than it does to OA proponents! Publishers are from Mars; OA proponents are from Venus.

Beyond the article itself we had a lot to say about the platform and the open peer review model. The article is essentially still in its pre-print version. It was posted on the F1000Research site before peer review. It was a fascinating process to see the reviewers’ reports as they were submitted, and to watch as others commented on the article and the authors responded. It gave the impression of a proper scholarly conversation taking place. This is ideally what journals should be facilitating. Technology allows this now – so why are so many journals still clinging to outdated formats from the print era?

The “open” nature of the reviews and comments also ensured an appropriate level of civility. Who has not received rude and unproductive comments from a reviewer who feels protected by their anonymity? (There is an entire Tumblr site devoted to such remarks!) However, if reviewers are obliged to reveal themselves, not just to the authors but to the whole readership, then they are more likely to behave diplomatically and to provide constructive and substantiated critiques. This also works in the reviewer’s favour: readers (and evaluators) can plainly see the amount of work and time the reviewer invested. If reviewers have spent considerable time providing a thoughtful review, they can justifiably link to it from their CVs, and collegial committees can see for themselves the energy expended.

We also spoke of how we might use this kind of journal format in information literacy instruction with students. This would more clearly make the point that scholarship is a conversation, and that there are multiple points of view. It would demystify the peer review process too: we can see the issues raised by the reviewers and can follow the paper into its next version seeing how the authors might address these concerns. This process is usually completely hidden from the average reader; so it is difficult for a student to imagine a paper other than the final version.

These various versions of papers do present challenges for the reader in citing though! It seems that all the versions remain on the site and have their own DOIs, but the added complexity in citing remains. This is a relatively minor issue though compared to the benefits of an open scholarly conversation that such a model of peer review allows.

We look forward to seeing the next version of this article and continuing the conversation on the benefits of OA!

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Carrots & Sticks: Encouraging self-archiving in an IR. C-EBLIP Journal Club, Mar 31, 2016

by Shannon Lucky
IT Librarian
University Library, University of Saskatchewan

Article: Betz, S., & Hall, R. (2015). Self-Archiving with Ease in an Institutional Repository: Microinteractions and the User Experience. Information Technology and Libraries, 34(3), 43–58. http://doi.org/10.6017/ital.v34i3.5900

One of the things I love about the C-EBLIP journal club is the ease of having one of my colleagues pick out an interesting article from their area of specialization so I can poke my head into their world for an hour and see what ideas they are wrestling with. As an IT librarian, I find that picking an article creates some anxiety, because systems and technology aren’t always that accessible (or interesting) to a diverse audience. I was happy to see Sonya Betz and Robyn Hall’s article pop up on a library tech listserv, as it was a great fit for our group.

The University Library currently doesn’t have an institutional repository (IR) for the entire campus, but we do have a DSpace eCommons repository for research by UofS librarians. Because we have all deposited our own work into eCommons, our conversation started with a unanimous (good-natured) rant about how hard self-archiving is. It is time-consuming, and the technology was deemed frustrating and unsatisfying. Like other tedious institutional reporting systems, we assumed this was the only way. As one member put it, “I didn’t know we could expect better”.

While we talked about how frustrating the process could be, we also wondered just how much effort, time, and money should be invested in improving a system that we all have to use, but that our library users will never see. When do we make the call that something is good enough and we, or our fellow faculty, can suck it up and figure it out or ask for help? One of my favourite suggestions was that a “good enough” scenario would have the user feeling “the absence of anger”. Apparently the bar is quite low. Betz and Hall talk about some of the barriers to self-archiving but don’t ask why, when contributing to IRs is so difficult, many academics voluntarily submit their work to sites like academia.edu and ResearchGate – what is it they are doing right that we could learn from?

This led to a discussion about what libraries could do to encourage faculty, both within and outside the library, to deposit in an IR. We saw two routes: the carrot and the stick.


Carrots:
• Link academic reporting systems together to cut down on the number of places this information needs to be input (e.g., have citations from the IR export to formatted CVs, link ORCID accounts with IR entries for authority control and better exposure, etc.)
• Group scholarly output for colleges, departments, or research groups together in the IR to show the collective impact of their work
• Gamify the submission process with progress bars, badges, and the ability to level up your scholarly work
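The first carrot above (exporting IR citations into a formatted CV) can be sketched in a few lines of Python. The record fields and citation style below are hypothetical, not drawn from any particular repository’s API:

```python
# Sketch: render institutional repository metadata as CV-ready citations.
# The record dictionaries here are invented; a real IR (e.g. DSpace) would
# expose comparable fields through its OAI-PMH or REST interface.

def format_citation(record):
    """Render one repository record as a simple author-date citation."""
    authors = ", ".join(record["authors"])
    return (f'{authors} ({record["year"]}). {record["title"]}. '
            f'{record["journal"]}. {record["handle"]}')

records = [
    {"authors": ["Lucky, S."], "year": 2016, "title": "Example deposit",
     "journal": "Sample Journal", "handle": "https://hdl.handle.net/10388/0000"},
    {"authors": ["Doe, J.", "Roe, R."], "year": 2014, "title": "Older deposit",
     "journal": "Another Journal", "handle": "https://hdl.handle.net/10388/0001"},
]

# Newest first, as a CV publication list usually runs.
for rec in sorted(records, key=lambda r: r["year"], reverse=True):
    print(format_citation(rec))
```

The point of the carrot is exactly this kind of reuse: once metadata lives in the IR, the same records can feed a CV, a departmental publication list, or an ORCID profile without being re-keyed.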

Sticks:
• Money. Canada Council requires submission to an IR as a part of their funding model
• Librarians armed with actual sticks going office to office “persuading” scholars to deposit their research

We agreed that libraries don’t wield an effective stick in this scenario. Research services, colleges, and departments have to be the ones to put on the pressure to deposit. Librarians can help make that happen and (hopefully) make it as pain-free as possible.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Describing a phenomenon through experience: C-EBLIP Journal Club, February 16, 2016

by Carolyn Pytlyk
Research Facilitator
University Library, University of Saskatchewan

Article:
Forster, M. (2015). Phenomenography: A methodology for information literacy research. Journal of Librarianship and Information Science, 1–10. doi:10.1177/0961000614566481

Way back in October at the C-EBLIP Fall Symposium, Margy MacMillan from Mount Royal University talked about phenomenography as a new methodology for conducting research on information literacy. Phenomenography is “the empirical study of the limited number of qualitatively different ways in which various phenomena in, and aspects of, the world around us are experienced” (Marton, quoted in Forster, p. 1). Margy’s enthusiasm and excitement for phenomenography certainly piqued my interest. In my conversations with library researchers, research methodology is often the topic of discussion when planning research projects, applying for grants, or developing research budgets, and it can sometimes be a stumbling block for researchers. As such, when my turn came to convene Journal Club, I thought Forster’s review article might be a good opportunity to explore phenomenography as a viable methodology for library researchers.

The majority of our conversation revolved around whether phenomenography is indeed a useful new methodology for conducting library research. For the most part, we agreed that, as presented in the review article, it seemed a rather complex and involved methodology. However, in the end, we couldn’t really tell without actually following a researcher through the process. The review article was a fairly good introduction to and overview of phenomenography, but we agreed that to really understand its complexity we would need to read research employing phenomenography as a methodology, to see how it works and whether it is really as complex as it seems at the outset.

While presenting an intriguing and possible methodological alternative, this article left us with many more questions than answers. Some questions stemming from this review article include:
1. Is this a useful methodology? Would library researchers use it?
2. Is it a methodology about how we think?
3. How do researchers unobtrusively interview people without priming the participants? Is it even possible?
4. Is it a complex methodology, or does it just seem like it?
5. What are the steps involved? How does someone actually do it?
6. Could it be appropriate for library research other than information literacy (like usability or librarians as researchers)?
7. What other methodologies are out there in other disciplines that are possible for library research?
8. What sorts of learning/training would researchers need before undertaking phenomenography?
9. Do researchers have to be experienced interviewers to use it?

Still, despite the numerous unanswered questions, we were not deterred and were in agreement that we are all keen to learn more about it and its process.

Finally, we rounded out our conversation with the value of review articles, although not all of us are keen on them. (Don’t worry; I won’t name names.) Forster’s article not only opened our eyes to phenomenography as a new methodology; it also opened our eyes, as both consumers and producers of knowledge, to the value of review articles in providing overviews of new methodologies.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Useful, Usable, Desirable: C-EBLIP Journal Club, January 7, 2016

by Jaclyn McLean
Collection Services, University Library
University of Saskatchewan

At our first journal club meeting of 2016, I chose an article from Weave, a new peer-reviewed, OA, web-based journal, to start a discussion about usability principles and teams in academic libraries:

Godfrey, K. (2015). Creating a culture of usability. Weave: Journal of Library User Experience, 1(3). http://dx.doi.org/10.3998/weave.12535642.0001.301

I’ve been reading a lot in the areas of usability and user experience (UX), especially in libraries, as I build a foundation of knowledge for my program of research. This article seemed like an interesting introduction for those less familiar with usability principles, and the idea of a culture of usability across the library intrigued me. I also like the Weave editorial philosophy, especially its primary aim “to improve the practice of UX in libraries, and in the process, to help libraries be better, more relevant, more useful, more accessible places.” This aligns well with some of the reasons I picked usability and UX for my research, and with ideas I keep in mind in my practice as well. Before I dig into the article and our discussion, I just want to mention something about usability and UX. Weave is a journal aiming to improve UX, but the article we read was about usability teams and principles. Usability and UX are not the same (though you’ll see the terms used nearly interchangeably at times, incorrectly).

Godfrey’s article could be divided into two broad sections: a description of usability and usability teams, and an examination of the local experience at Memorial University Libraries. In the first section, she frames her discussion with a literature search on usability principles and practices, and the newer concept of standing usability teams in libraries. She also discusses the importance of making usability a core concept in all areas of library development – physical, virtual, and service. She describes the core concepts of usability, and how Memorial is consciously applying the examination of pain points, and other concepts usually confined to online environments, to their physical spaces. The challenges of creating a culture of usability (or of changing any culture), and especially the concept of join-in rather than buy-in when attempting such a significant change, were very interesting to think about.

The second section gives an overview of the Memorial University Libraries context, and how the implementation of a usability team went there. Godfrey outlines how the team was formed, what’s been done so far, and some plans for the future. She identifies the creation of their usability team as “the first step to creating a culture of usability and improving the user experience.”

Our discussion ranged widely, from the style of the article, to ideas of usability beyond the web, concepts of building culture, and beyond. Several of us were hungry for more – details of the actual projects undertaken by the usability team and their outcomes – but recognized that this wasn’t the article we had in our hands. This article felt more like an introduction to the concept of standing usability teams in libraries, an overview of usability concepts, and some local experiences rather than a full case study or assessment of a usability team in a library.

The bulk of our discussion focused on local context. We already do a lot of talking about our different cultures and how to build them here, and have focused recently on building cultures in the areas of leadership, project management, assessment, and EBLIP. How many cultures can one workplace consciously foster, we wondered? Could we honestly see something like a standing usability team happening here? In the end, we thought that folding usability concepts and ideas into work we already do, and into good standing committees already in place, would be more successful in our context. In that vein, we specifically talked about EBLIP – because by its very definition, EBLIP takes our users into account. So maybe rather than adding a new culture shift to our agenda, it’s more about keeping the user aspect of EBLIP in mind when we implement or assess services and programs – and using that as a reminder to stop assuming we know what our users need, or to check in with our users on a regular basis.

Libraries have a bad reputation for looking inward and forgetting about our users – so even broad discussions of user preferences and initial user consultation could be a significant improvement. I know from my own area of work (technical services) that a key example of how we fall down on user consultation is when a discovery system needs to be reconfigured, and only library staff – rather than users – are consulted about needs and preferences.

In the end, this article made us hungry for more. As practitioners, we were immediately curious about the how and the what of the work. We wanted to see the outcomes of the iterative testing, the aggregated responses from the survey, and the results from this standing team. We hope that Godfrey is planning a follow-up with more details from on the ground, so we can continue to learn from what seems to be a unique project. Krista, if you’re reading this, I hope that you are planning to share more about the work you’ve done so far and what’s planned next!

Publish or practice, never that simple: C-EBLIP Journal Club, November 17, 2015

by Selinda Berg
Schulich School of Medicine – Windsor Program
Leddy Library, University of Windsor
University Library Researcher in Residence, University of Saskatchewan

As the Researcher-in-Residence, I was very eager to convene the November gathering of the University of Saskatchewan Library’s C-EBLIP Journal Club. I think that this initiative by the Centre (C-EBLIP) is incredibly valuable to librarians: It expands our understanding of the research landscape; increases our understanding of our colleagues’ research interests; and diversifies our perspectives and deepens our knowledge about research.

The article we discussed in November was:
Finlay, C. F., Ni, C., Tsou, A., & Sugimoto, C. R. (2013). Publish or practice? An examination of librarians’ contributions to research. portal: Libraries and the Academy, 13(4), 403-421.

In this article, the researchers share the results of their investigation into the authorship of LIS literature, with an emphasis on understanding the contributions and attributes of practitioner scholarship. The article intersects well with my own research interests and aligns with many of the ongoing conversations about the research outputs and productivity of academic librarians. The conversation was lively, informative, and thoughtful.

The article was well-received by those at journal club, with members highlighting its clear methods and style of writing. The discussion was diverse and led us to many different conversations, but three themes did emerge.

Other possible interpretations and explanations:
The authors found that there was a decrease in the proportion of articles published by practitioners between 2006 and 2011. The authors made a couple of suggestions as to why this may have occurred, including the increase in non-traditional publications and the decrease in expectations for research. In addition to these explanations, we discussed other possibilities, including a movement away from LIS journals as librarians’ research interests become more diverse; a decrease in tenure-track/tenured librarian positions (resulting in more contract positions without research opportunities, and perhaps more practice-heavy positions); and/or a change in the nature of articles, with a movement away from a focus on quantity of articles toward a focus on quality of research.

Application of method and findings to the development of institutional standards and a disciplinary research culture:
The discussion led to interesting conversation about how contributions to scholarship are measured, both in relation to our disciplinary research culture and to institutional standards. As scholarly communications evolve, is the counting of articles in respected journals the only (or best) way to evaluate research contributions? This led us to further consideration of how disciplinary differences in research culture shape the interpretation of contributions, and of how the relatively young and immature research culture in academic libraries makes it difficult to name our disciplinary criteria and, in turn, develop institutional standards.

Related research questions:
The article was really well-received, and from good research comes more questions. The article raised some interesting discussion about related research questions that were not within the scope of the study. There was an interest in knowing more about the qualities and attributes of the librarians who have been publishing (including their positions, their length of service, their motivations for research, and the factors that determined where they publish). There were also questions as to whether the librarians who are contributing to scholarship through “traditional” scholarly venues are also contributing to scholarly conversations through non-traditional formats (blogs, open publishing, etc.). Lastly, there was an underlying assumption that these two bodies of literature, written by two sets of authors (LIS scholars and practitioner-scholars), interact and impact each other; however, there was an interest in knowing how they actually do interact: for example, are they citing each other, or do they cite only their own communities?

Great discussion ensued at the meeting, and some stimulating ideas were generated from the many interesting findings within the paper and beyond. Looking forward to January!

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Data for Librarians: C-EBLIP Journal Club, October 1, 2015

by Kristin Bogdan
Engineering Library, University of Saskatchewan

At the second meeting of the C-EBLIP Journal club for 2015-2016, held on October 1, 2015, we discussed the article:

MacMillan, D. (2014). “Data Sharing and Discovery: What Librarians Need to Know”. The Journal of Academic Librarianship, 40, 541-549. http://dx.doi.org/10.1016/j.acalib.2014.06.011

I chose this article because it is a nice overview of the key things that librarians should be familiar with about data and data management. MacMillan does a great job of synthesizing the information out there and applying it in a Canadian context, where current data management trends are not as driven by granting agencies as they are in other jurisdictions (although that could be coming). There was general agreement that the article was a useful place to start when it comes to understanding where data management can fit into library services and systems.

The flow of the discussion changed as we looked at data sharing and discovery based on the roles that librarians and information scientists fulfill in this context. We recognized that the library is a possible home for research data and that we have a role as educators, curators, and stewards of data, but we are also researchers who consume and produce data. These points of view overlap and complement each other, but also offer different ways of looking at how the library can be involved.

When it comes to our role as curators and stewards of data, we discussed the kinds of things that could make data sharing difficult. The members of the Journal Club acknowledged that there is a difference between being able to find data and being able to provide those data to patrons in a way that is usable and sustainable. Infrastructure is required for data sharing and discovery, and there are many possible ways to make this happen. Should libraries have their own repositories or take advantage of existing repositories? What are the possible down-sides of housing data in institutional repositories instead of those that are discipline-specific (highlighted by MacMillan on page 546)? How can we work together to make the most of our limited resources and provide the most comprehensive services for Canadian researchers? Resources are being collected by the Canadian Association of Research Libraries (CARL), including a list of institutional repositories and adoptive repositories (http://www.carl-abrc.ca/ir.html). We talked briefly about data journals as dissemination venues, but wondered about the implications of publishers owning this content.

Issues around data privacy also came up in the discussion. Concerns were raised around security and the measures in place to make sure that individuals’ identities were protected. The Saskatchewan Research Data Centre (SKY-RDC) was identified as an example of how data can be distributed in a controlled way to protect research subjects (more about the SKY-RDC here: http://library.usask.ca/sky-rdc/index.html). We came to the conclusion that, when it comes to sensitive research data, privacy will trump sharing.

Our role as data producers and consumers brought up concerns about when it is appropriate to release data that are still being written about. The idea of being scooped came up as a possible deterrent to making data public. This applies to “small” data as much as to “big” data. There were also concerns about how data sets would be used after they were made public. What if they were not used in a way that was consistent with their intended purpose? Data documentation can help users understand the data and use it in a way that enriches their research but acknowledges the possible limitations of the original data set. Data citation is an important if still relatively new practice, and part of our role as stewards and creators will be to make citing data as easy and commonplace as citing other materials.

In the end, I think this article was a great place to begin the discussion of data sharing and discovery in the context of libraries for the C-EBLIP Journal Club. The discussion generated more questions than answers, which made it clear that this is a topic worthy of further investigation.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Organizational Innovation: C-EBLIP Journal Club, August 20, 2015

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice

The first C-EBLIP journal club meeting of the 2015/16 academic year took place on Thursday, August 20, 2015. Despite many librarians being in full-on holiday mode at the time, six participants discussed the following article:

Jantz, R.C. (2015). The Determinants of Organizational Innovation: An Interpretation and Implications for Research Libraries. College & Research Libraries, 76(4), 512-536. http://crl.acrl.org/content/76/4.toc

I chose this article with the idea that innovation was a light and exciting topic, perfect for an August journal club meeting. While the article did have some redeeming qualities, the meeting had a bit of a rocky start: this article is long and slightly unwieldy! Jantz conducted a research study that focused on innovation as a specific type of change, and he identified five factors that had a significant impact on how well libraries innovate. The research method consisted of a survey distributed to 50 member libraries of the Association of Research Libraries.

The discussion opened up with some problems around the methodology of the research. The details of how the research was conducted were sketchy: there was no word on how Jantz coded the data, and no detailed discussion of analysis or collection methods. The survey instrument was not included in the paper, nor were the raw survey results, which, one journal club member commented, would have been the interesting part! In terms of the instrument, it would have been helpful to see it, as club members wondered whether the author defined the complex terms he used. How can we be sure that all the respondents were on the same page? The author also did not list the specific institutions surveyed, causing us to wonder whether they were perhaps skewed toward the sciences. Would liberal arts universities/libraries have R&D departments?

Club members found it problematic that the surveys were only administered to senior leadership teams, as views could be quite different down the organizational hierarchy. As well, as the responses were said to have come from senior leadership teams, members were interested in how this might have logistically happened. Did the teams get together to fill out the surveys? Did each team member fill out a survey and were the results of those collated? This is an example of insufficient detail around the methods employed for this research and the problems could have been alleviated with more attention to detail. It was a good takeaway for my own research: be detail oriented, especially when it comes to methodology! If the research is to be useful, it has to be seen as valid and reliable. That won’t happen if the reader is questioning the methods.

Another issue with the paper was that in several instances, the author stated things as established facts but did not cite them. Perhaps the author made assumptions, but as we are attuned to the idea of providing evidence for claims, we weren’t buying it. Another takeaway: in terms of evidence, more is more! Or at the very least, some is vastly better than none. As well, in the conclusion of the paper, the author used the terms vision and mission interchangeably, which particularly irritated one journal club member and was another example of imprecision.

The discussion moved from the particulars of the article to innovation in general with some observations being made:
• Innovation is not dependent on age: people across the age spectrum can be innovative.
• We are overwhelmed by choice: the more choices we have, the harder it is to make one. Decision-making is difficult.
• Is there a type of innovation on the other side of radical, i.e. useless? Change for change’s sake?
• One participant felt that innovation of all types can be useless. Innovation doesn’t necessarily create more viable choices.
• Libraries are always playing catch up…it may be innovative to us but not elsewhere [depending of course on the library].
• Is there a difference between innovation and library innovation? Between innovation and implementation?

At the very least, C-EBLIP Journal Club members felt that reading about innovation could be valuable when heading into a state of change. And let’s face it, we’re generally dealing with change more often than not these days. To conclude, although the article had some methodological gaps and members felt that the author could have been more selective when transforming his PhD dissertation into an article, the article did give us the basis for a fruitful discussion on innovation and the first meeting of 2015/16 was an hour well spent.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.