When you play the game of information literacy… students win!

by Chris Chan
Head of Information Services at Hong Kong Baptist University Library

“Let’s play a game.” These are not words that I usually utter during library instruction. But it is a phrase that will become more common, if the results of a recent experiment are anything to go by.

During my ten years as an instruction librarian, student engagement during information literacy (IL) sessions has been a persistent challenge. I know that the practices and dispositions librarians teach are vital to students’ success, and in many cases the students themselves recognize this too. Yet the one-off nature of much of the instruction that takes place at my library means that students are often expected to learn these abilities devoid of any meaningful context.

Over the years, my colleagues and I have experimented with various ways to increase engagement. PowerPoints have been replaced with Prezis, and lengthy librarian monologues or demonstrations are now interspersed with questions posed via online polls that students can respond to via their smartphones. Wherever feasible, hands-on exercises are incorporated into instruction. Nevertheless, in our feedback surveys the effectiveness of activities is consistently rated lower than the relevance of the session itself.

In my interpretation, this is indicative of a need on our part to do more to design engaging instruction sessions. How does this relate to playing games? When implemented effectively, games and gamification have the potential to provide the spark that seems to be missing from our instruction programme. As Jennifer Young writes in her 2016 article on using games to teach information literacy:

Good educational games will motivate and engage students, provide context for information in the course, offer satisfying work that puts students in a state of “flow,” and encourage collaboration and social learning.

Of course, designing a good educational game is easier said than done. Many of the games described in the literature are digital, which presents an additional technical barrier. Recently, however, I stumbled across a physical card game called Search&Destroy. Designed by librarians at Ferris State University, it challenges students to use their database searching abilities to be the last person standing. Essentially, players draw keyword and modifier cards, and must run searches on a chosen database. The goal is to avoid running a search that returns zero results.

This concept intrigued me, and I purchased a copy to experiment with it. First I ran through the game with fellow librarians, and it was a tremendous amount of fun. There is definitely a certain thrill to saddling your opponent with cards that make their searches much more difficult (e.g. item must be in French!).

Our next step was to find out if students enjoyed the game as much as librarians. At HKBU Library we run regular learning events for which students receive a required co-curricular credit. As the Library has control over the content of these sessions, they were a natural place to play the game with students in an informal setting.

So on the afternoon of 5 March 2018 I found myself sitting with a group of six students explaining the rules of the game. I served as a sort of referee, guiding play around the circle. They quickly got the hang of it, and became very engaged in the competitive aspects, with some players forming impromptu alliances to gang up on and eliminate mutual foes.

The design of the game produced many teachable moments. For example, one of the cards allows a player to use the OR operator in their search statements, which led to a discussion of why this is beneficial if your goal is to avoid 0 search results.
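Why OR helps can be shown with a small sketch (Python here, purely illustrative; real databases return ranked records, not plain sets). If each keyword’s matches are modelled as a set of document IDs, OR is a set union, so the combined result can never be smaller than either keyword’s results alone, which is exactly why it protects a player from zero results, while AND (an intersection) does the opposite.

```python
# Model each keyword's matching records as a set of document IDs.
# (Illustrative only: real search engines rank results, they don't
# just return sets, but the counting logic is the same.)
cats = {1, 2, 3}      # documents matching "cats"
felines = {3, 4}      # documents matching "felines"

and_results = cats & felines  # AND narrows: intersection of the sets
or_results = cats | felines   # OR broadens: union of the sets

# The union can never be smaller than the larger input set.
assert len(or_results) >= max(len(cats), len(felines))

print(len(and_results), len(or_results))  # 1 4
```

In game terms: a player handed extra keyword cards wants to join them with OR, because each OR can only hold the result count steady or grow it.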

At the end of all learning events we do a quick anonymous survey. For the Search&Destroy event, all students either agreed or strongly agreed that they had learned something interesting or useful. Qualitative comments included: “very good game, new experience” and “interesting and good”, which indicate that for this small group at least, the activity was successful in teaching search skills in an engaging manner.

While the overall experience was great, after reflection I identified several areas that could be improved or considered further. First, the game took much longer than the expected 15 minutes. I was hoping to fit in at least two rounds of the game, but ended up with just one that took almost 45 minutes. This was partly due to the number of players, and also due to the fact that we used the Library’s discovery service as the database for the game. Because of its wide coverage of full text content, it took some time before players were at risk of getting 0 results. More specialist disciplinary databases could produce quicker rounds.

Another future consideration is how best to incorporate the game into typical course-integrated instruction (as opposed to a one-off event). An activity like this would be great for introductory first-year courses, but such sessions typically have 20-30 students. Running multiple simultaneous sessions of the game in a class would be possible, but quite intensive in terms of staffing resources.

Scaling up in this way will definitely be a challenge, but it is one that I am keen on exploring after this positive initial experience with game-based instruction.

For those attending LOEX 2018 interested in learning more, librarians from Ferris State will be running an interactive workshop.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The High Hopes for Our Research: Don’t Lose Sight

by Selinda Berg
Leddy Library, University of Windsor

When I deliver professional development workshops on cultivating research topics and creating research questions, I highlight the importance of ensuring that research topics and questions have significance, and that researchers can articulate that significance. This is not about statistical significance, but rather about how the question aims to have consequence, impact, and importance. This is imperative because, as Huth (1990) notes, research papers are:

“not just baskets carrying unconnected facts like a telephone directory; they are instruments of persuasion. [Research] must argue you into believing what they conclude; and be built on the principles of critical arguments.”

Researchers must be able to articulate why their research matters and what it sets out to convince the reader of. This kind of significance is not always strongly articulated in our professional literature. Often, researchers start with a strong understanding of the high hopes and intended impact of their research, but over the course of the arduous research process they lose sight of it, and in turn readers and audiences cannot see it either. Articulating the significance of research is important to ensure that the research fits into a wider context, that it can be built upon to achieve the larger goal, and that its importance is explicit.

Librarians engaging in research to support critical librarianship do this well. They are explicit that they aim to identify, expose and disrupt social and political powers that underlie information systems (Gregory & Higgins, 2013, 3). I think many researchers hope that their research can contribute to a healthier work environment, a stronger profession, or a better society, but they do not situate their research within these higher goals. Many librarians, independent of method or approach, set out to uncover injustices, inequities, or areas where we can just do better within and around our profession, but are not overt in their intentions. We need to consciously and explicitly do this better.

I encourage researchers to not lose sight of the larger goals that inspired them to engage in research, and to use space within presentations and articles to situate their research within a wider context and within their high hopes for how their research might just lead to a stronger profession or better society.

References
Huth, E. J. (1990). How to write and publish papers in the medical sciences. Baltimore: Williams & Wilkins.

Gregory, L. & Higgins, S. (2013). Information literacy and social justice: Radical professional praxis. Duluth: Library Juice Press.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

An argument for transdisciplinary research for the library and information professions

by Tegan Darnell, Research Librarian, University of Southern Queensland

Put simply, ‘transdisciplinary’ research draws on work from a number of different disciplines to approach a problem or question in a holistic way, but it is distinct from other cross-disciplinary methodologies in that it describes research that attempts to interrogate space across, between, or beyond the disciplines.

Interdisciplinary, cross-disciplinary, and multidisciplinary research remain inside the framework of disciplinary research. A library study that uses a method such as ethnography is one example of ‘interdisciplinary’ research. A ‘transdisciplinary’ approach is one that attempts to understand the wider world in a way that is not possible within disciplinary research.

Transdisciplinary research is a way of attempting to understand and address the complexities of those ‘wicked’ multi-faceted problems that involve human beings, nature, technology and society. Climate change, artificial intelligence, poverty, and health are all areas where transdisciplinary studies are beneficial.

As LIS professionals, we are working in a field that sits at the intersection of people, technology, ethics, information, and learning. Setting aside the rigid ways of thinking established within disciplines such as education and information science, and perhaps even within the framing of ‘evidence-based librarianship’, would create the intellectual space for LIS professionals to challenge our existing assumptions and realities.

Problems with complex social, economic, or ethical aspects such as:
• lack of diversity within the profession,
• scholarly communication and publishing models,
• copyright, intellectual property and piracy,
• technologist vs. humanist approaches to libraries,
• Western-centric approaches to information, knowledge and learning
could be approached with new conceptual, theoretical, and methodological investigations.

So, why is this important to LIS practitioners? Do you ever ask yourself:
• Are we really dealing with the problem here?
• Are we creating value for our community in the long term?
• Why are we paying for these subscriptions anyway?
• What is ‘authoritative’ information (and who says)?
• What about privacy?
• How can we address climate change as an organisation?
• How can I address my own ‘whiteness’ in my day to day professional practice?
• What does our preferred future library even look like?

I do. It is important to me that what I do affects the wider world in a positive way. In a very selfish way, when I go home in the evening I want to be able to tell my children that I do a job that makes the world a better place. If I can’t, I need to change what I’m doing.

Let’s make some connections with others, let’s find some new ways of thinking about solving these problems, because I’m ready, and I want some answers.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Considering collaborations

by Margy MacMillan
Mount Royal University Library

Most of my work in Library and Information Practice involves other people so it’s not surprising that working on building and using an evidence base for this work has brought me into close collaboration with people across the library, the campus, and global libraryland*. Reflecting on these experiences has illuminated some patterns and common factors in positive collaborations as well as some aspects that require attention at the beginning to ensure everyone stays friendly at the end.

One of the most important things is to align conceptions of realistic timelines, milestones and deadlines. In one group I worked with, this was evident even in small things – if we said we’d meet in the lobby at 7:30 to catch a shuttle to a conference, we were all there by 7:00. This congruence happened naturally among us, but is something that most groups I’ve been part of have had to work out. While the set dates of publications and presentations can be helpful motivators, developing a schedule that all collaborators are comfortable with should be part of the early planning stages.

Related to the question of time is motivation. Understanding why your collaborators are interested in the project and how it fits into their lives can help determine feasible timelines. If one partner needs to analyse data as part of planning for a new service and another sees the potential of this analysis to inform wider work through publication, the partners will have to accept different commitment and energy levels for different parts of the project. In situations like these, colleagues and I have often taken the lead at different stages: gathering, initial analysis, submission and write-up. While we all contributed to these stages, leading different parts was an effective way to align aspects of the projects with our skills and motivations, and ensured that no one felt overburdened.

A crucial aspect in both of these collaborations was that we trusted each other to do the work. That trust was built on frank discussions of available time and competing priorities, acknowledgements of each other’s expertise, and shared understanding of tasks and expectations. Looking back, those have been key factors in all of the successful collaborations I’ve been a part of.

Nancy Chick, Caitlin McClurg, and the author, collaborating on a cross-disciplinary project.

Openness to others’ expertise is, of course, critical when you are working across disciplinary boundaries. Your partner may be more comfortable in a different research methodology, or simply a different citation style, and developing a shared language around the project is critical. Disciplines bring distinct terminologies and conventions around knowledge creation and dissemination (to see this in action, bring a table of mixed faculty together, open the discussion of author name order, and stand back). These differences affect the questions you ask, the evidence you value, the analysis you undertake, and the audience(s) for the final product. Just as you would when coding data, nothing works quite so well as writing down decisions once you find consensus. It’s easy (and occasionally disastrous for a project) to make assumptions about shared understandings when working with people in your own discipline, but I’ve found these groups can have just as divergent thinking as cross-disciplinary ones. The early communication stage is often skipped on the assumption that, as members of the ‘hive mind’ of librarianship, we have common conceptions of information literacy, of what term we should use for patron/user/client, or of how open a publication needs to be to count as OA.

Much of this, whether negotiating meaning across disciplines, negotiating time zones and spelling conventions across borders and oceans, or negotiating variations in motivation, is a matter of making the tacit explicit: of learning how to say what we mean, what we need, and what we can do, clearly and without apology.

It turns out that this really is one of the great unsung benefits of collaboration. Working with others has taught me more about my professional self than any other activity. It has made me think about my values as a librarian, as a researcher, and as a teacher, and in articulating those values to others I have found a strengthened sense of purpose. Negotiating the meaning of information literacy, whether with library colleagues or with other faculty has given me a more nuanced personal definition, and helped me enact and communicate that definition in my teaching and scholarship. I have found that these meaning-making tasks have been far more productive and authentic when I have worked on them as a means to collaboration than when I have considered them as ends in themselves.

Try starting your next collaboration with the kind of conversation that engages participants in self-explanation, where tacit assumptions and definitions are brought into the light of others’ questions, probed for nuance, and made explicit. There is no guarantee this will lead to a trouble-free project of course, but according to the OED ‘explicit’ does derive from the classical Latin explicitus: free from difficulties… so it just might.

*A semi-mythical place where all information is well-organized, all colleagues are congenial and collegial, and timezones prove no barrier to productive conversations.

For a longer discussion of collaboration in research, I highly recommend the “Coda on Collaboration” chapter of Critical Reading in Higher Education: Academic Goals and Social Engagement by Karen Manarin, Miriam Carey, Melanie Rathburn, and Glen Ryland, 2015, Indiana University Press.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

I can’t make bricks without clay: A Sherlock Holmesian approach to Research and EBLIP

by Marjorie Mitchell
Librarian, Learning and Research Services
UBC Okanagan Library

Sherlock Holmes was invoked in the inaugural post of Brain-Work, the C-EBLIP blog, and I would like to revisit Conan Doyle for the inspiration of this post. So, being a curious librarian, I searched “Sherlock Holmes and research” and came across the heading “What user researchers can learn from Sherlock Holmes.” The author, Dr. Philip Hodgson, took quotes from a variety of the Sherlock Holmes stories and laid out a five-step research process for investigators working in the user-experience field. As I read Dr. Hodgson’s article, it struck me that there was a strong kinship between the user-experience community and the library community that extends beyond the electronic world. I also believe Hodgson’s steps provide a reasonable starting point for novice evidence-based library and information practice (EBLIP) researchers to follow.

Step 1 – According to Hodgson, the first step is understanding the problem, or formulating the question. I would adapt it even further and suggest being curious is the very first step. If we’re not curious, we can’t identify what we want to know, or the question we hope to answer. Perhaps a mindset of being open to the discomfort of not knowing motivates researchers to embark on the adventure of inquiry. Once my curiosity has been aroused, I move to formulating a question. Personally, my question remains somewhat fluid as I begin my research because there are times I really don’t have enough information to formulate an answerable question at the beginning of my research.

Step 2 – Collecting the facts, or as I prefer to call it, gathering the evidence, follows. This is one of the juicy, tingly, exciting parts of research. Once I have a question, I think about what information will answer it. Sometimes simply reading the literature will give me enough of an answer. At other times, I have to go further. Even just thinking about methods can send a shiver of excitement through me. Administering surveys, conducting interviews, or running reports from the ILS in hopes they will illuminate some arcane or novel library user behavior are all ways of collecting juicy evidence; it is exciting to see initial results come in and begin to decipher what they are actually saying. Sometimes the results are too skimpy, or inconclusive, and the evidence-gathering net needs to be cast again in a different spot for better results.

Step 3 – Hodgson suggests the next step should be developing a hypothesis to explain the facts you have gathered. This step, as much as or more than the others, requires our brain-work. Here we bring our former knowledge to bear on the results and how they relate to the question. It is a time for acute critical thinking as we take the results of our evidence gathering and determine their meaning(s). Several possible meanings may arise at this stage. Hodgson implies it is important to remain open to the multiple meanings and work to understand the evidence gathered in preparation for the next step.

Step 4 – In this step, Hodgson is especially Holmesian. He suggests it is now time to eliminate the weaker hypotheses in order to come closer to a solution. The focus on user experience research is especially strong here. Specific, actionable solutions are being sought to the question identified in the first step. Here he recommends evaluating your evidence to eliminate the weaker evidence in favor of the stronger. He is also cognizant of the need for solutions that will be affordable and able to be implemented in a given situation. While the whole of this step may not apply to all research, much of it will.

Step 5 – Implementation or action now has its turn. Again, Hodgson is speaking directly to the user experience audience here. However, implementation or action based on research may lead to a decision not to implement or act upon a suggestion. The strength lies in the process around reaching this decision. Questions were asked; evidence was gathered; analysis took place; judgment was applied. As Hodgson pointed out, this is a much better process than proceeding by intuition.

Finally, I would like to add a Step 6 to Hodgson’s list. In order to really know whether the action implemented had the desired effect, or an unintended one, it is important to evaluate the results of the action or change. In the effort to publish results of research, timeliness is an issue. It is not often possible to have the luxury of the amount of time it would take to measure an effect. However, even in those cases, I am interested in what type of evaluation might take place at a later date. Sometimes researchers address their future evaluation plans; sometimes they don’t. Even if those plans aren’t being shared, I hope they are being considered.

This is a simple and elegant plan for research. In its simplicity, it glosses over many of the messy complications that arise when conducting research. That said, I hope this post encourages librarians to follow their curiosity down the path of research.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Open Data and EBLIP – How open are we?

by Pam Ryan
Director, Collections & Technology at Edmonton Public Library

When we talk about evidence based library and information practice (EBLIP), we’re most often talking about research – conducting our own or finding and integrating research and best-available evidence into our practice. Part of this continuum should also include working towards making our own library service and operations data openly available for analysis and re-use.

This isn’t out of line with current library initiatives. Academic libraries have long supported the open access movement and for many, services around managing institutional research data are a current priority. Influenced by open government developments in their municipalities, public libraries are increasingly working to increase open data literacy through programming, encouraging citizens to think critically about government services and learn how to unlock the value of open data.

Why does open data matter for libraries? It aligns with our core values of access to information, sharing, openness, transparency, accountability, and stewardship. It supports our missions to provide information and data literacy, it can provide others with information to help us in our advocacy and value of libraries initiatives, and maybe most importantly, it can fuel research and initiatives we ourselves haven’t yet thought of.

My own place of work has a current business plan goal to: Develop an open data policy that includes how we will use and share our own data; participate in Edmonton’s Open Data community and support data literacy initiatives. We’ve begun to make progress in these areas by developing a statement on open data and collaborating with the City of Edmonton on public programs:

• Edmonton Public Library’s Statement on Open Data:
http://www.epl.ca/opendata

• EPL’s 2014 Open Data Day program:
http://www.epl.ca/odd2014

Has your library started a discussion about what your library’s approach to open data will be?

Further Reading:

Thompson, B. (2014). The open library and its enemies. Insights, 27(3), 229–232. DOI: http://dx.doi.org/10.1629/2048-7754.172

Data is Law / Civic Innovations: The Future is Open http://civic.io/2014/12/28/data-is-law/

pryan@epl.ca / Twitter: @pamryan

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.