An argument for transdisciplinary research for the library and information professions

by Tegan Darnell, Research Librarian, University of Southern Queensland

Put simply, ‘transdisciplinary’ research draws on work from a number of different disciplines to approach a problem or question holistically, but it is distinct from other cross-disciplinary methodologies in that it attempts to interrogate the space across, between, or beyond the disciplines.

Interdisciplinary, cross-disciplinary, and multidisciplinary research all remain inside the framework of disciplinary research. A library study that uses a method such as ethnography, for example, is ‘interdisciplinary’ research. A ‘transdisciplinary’ approach is one that attempts to understand the wider world in a way that is not possible within disciplinary research.

Transdisciplinary research is a way of attempting to understand and address the complexities of those ‘wicked’ multi-faceted problems that involve human beings, nature, technology and society. Climate change, artificial intelligence, poverty, and health are all areas where transdisciplinary studies are beneficial.

As LIS professionals, we work in a field at the intersection of people, technology, ethics, information, and learning. Abandoning the rigid ways of thinking established within disciplines such as education and information science, and perhaps even within ‘evidence-based librarianship’ itself, would create the intellectual space for LIS professionals to challenge our existing assumptions and realities.

Problems with complex social, economic, or ethical aspects such as:
• lack of diversity within the profession,
• scholarly communication and publishing models,
• copyright, intellectual property and piracy,
• technologist vs. humanist approaches to libraries,
• Western-centric approaches to information, knowledge and learning
could be approached with new conceptual, theoretical, and methodological investigations.

So, why is this important to LIS practitioners? Do you ever ask yourself:
• Are we really dealing with the problem here?
• Are we creating value for our community in the long term?
• Why are we paying for these subscriptions anyway?
• What is ‘authoritative’ information (and who says)?
• What about privacy?
• How can we address climate change as an organisation?
• How can I address my own ‘whiteness’ in my day to day professional practice?
• What does our preferred future library even look like?

I do. It is important to me that what I do affects the wider world in a positive way. In a very selfish way, when I go home in the evening I want to be able to tell my children that I do a job that makes the world a better place. If I can’t, I need to change what I’m doing.

Let’s make some connections with others, let’s find some new ways of thinking about solving these problems, because I’m ready, and I want some answers.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Considering collaborations

by Margy MacMillan
Mount Royal University Library

Most of my work in Library and Information Practice involves other people so it’s not surprising that working on building and using an evidence base for this work has brought me into close collaboration with people across the library, the campus, and global libraryland*. Reflecting on these experiences has illuminated some patterns and common factors in positive collaborations as well as some aspects that require attention at the beginning to ensure everyone stays friendly at the end.

One of the most important things is to align conceptions of realistic timelines, milestones and deadlines. In one group I worked with, this was evident even in small things – if we said we’d meet in the lobby at 7:30 to catch a shuttle to a conference, we were all there by 7:00. This congruence happened naturally among us, but is something that most groups I’ve been part of have had to work out. While the set dates of publications and presentations can be helpful motivators, developing a schedule that all collaborators are comfortable with should be part of the early planning stages.

Related to the question of time is motivation. Understanding why your collaborators are interested in the project and how it fits into their lives can help determine feasible timelines. If one partner needs to analyse data as part of planning for a new service and another sees the potential of this analysis to inform wider work through publication, the partners will have to accept different commitment and energy levels for different parts of the project. In situations like these, colleagues and I have often taken the lead at different stages: gathering, initial analysis, submission and write-up. While we all contributed to these stages, leading different parts was an effective way to align aspects of the projects with our skills and motivations, and ensured that no one felt overburdened.

A crucial aspect in both of these collaborations was that we trusted each other to do the work. That trust was built on frank discussions of available time and competing priorities, acknowledgement of each other’s expertise, and a shared understanding of tasks and expectations. Looking back, those have been key factors in all of the successful collaborations I’ve been a part of.

Nancy Chick, Caitlin McClurg, and the author, collaborating on a cross-disciplinary project.

Openness to others’ expertise is, of course, critical when you are working across disciplinary boundaries. Your partner may be more comfortable in a different research methodology, or simply a different citation style, and developing a shared language around the project is critical. Disciplines bring distinct terminologies and conventions around knowledge creation and dissemination (to see this in action, bring a table of mixed faculty together, open the discussion of author name order, and stand back). These differences affect the questions you ask, the evidence you value, the analysis you undertake, and the audience(s) for the final product. Just as you would when coding data, nothing works quite so well as writing down decisions once you find consensus. It’s easy (and occasionally disastrous for a project) to make assumptions about shared understandings when working with people in your own discipline, but I’ve found these groups can have thinking just as divergent as cross-disciplinary ones. The early communication stage is often skipped on the assumption that, as members of the ‘hive mind’ of librarianship, we have common conceptions of information literacy, of what term we should use for patron/user/client, or of how open a publication needs to be to count as OA.

Much of this, whether negotiating meaning across disciplines, negotiating time zones and spelling conventions across borders and oceans, or negotiating variations in motivation regardless of other differences or similarities, is a matter of making the tacit explicit: of learning how to say what we mean, what we need, and what we can do, clearly and without apology.

It turns out that this really is one of the great unsung benefits of collaboration. Working with others has taught me more about my professional self than any other activity. It has made me think about my values as a librarian, as a researcher, and as a teacher, and in articulating those values to others I have found a strengthened sense of purpose. Negotiating the meaning of information literacy, whether with library colleagues or with other faculty has given me a more nuanced personal definition, and helped me enact and communicate that definition in my teaching and scholarship. I have found that these meaning-making tasks have been far more productive and authentic when I have worked on them as a means to collaboration than when I have considered them as ends in themselves.

Try starting your next collaboration with the kind of conversation that engages participants in self-explanation, where tacit assumptions and definitions are brought into the light of others’ questions, probed for nuance, and made explicit. There is no guarantee this will lead to a trouble-free project of course, but according to the OED ‘explicit’ does derive from the classical Latin explicitus: free from difficulties… so it just might.

*A semi-mythical place where all information is well-organized, all colleagues are congenial and collegial, and timezones prove no barrier to productive conversations.

For a longer discussion of collaboration in research, I highly recommend the “Coda on Collaboration” chapter of Critical Reading in Higher Education: Academic Goals and Social Engagement by Karen Manarin, Miriam Carey, Melanie Rathburn, and Glen Ryland, 2015, Indiana University Press.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

I can’t make bricks without clay: A Sherlock Holmesian approach to Research and EBLIP

by Marjorie Mitchell
Librarian, Learning and Research Services
UBC Okanagan Library

Sherlock Holmes was invoked in the inaugural post of Brain-Work, the C-EBLIP blog, and I would like to revisit Conan Doyle as inspiration for this post. So, being a curious librarian, I searched “Sherlock Holmes and research” and came across the heading “What user researchers can learn from Sherlock Holmes.” The author, Dr. Philip Hodgson, took quotes from a variety of the Sherlock Holmes novels and laid out a five-step research process for investigators working in the user-experience field. As I read Dr. Hodgson’s article, it struck me that there was a strong kinship between the user-experience community and the library community that extends beyond the electronic world. I also believe Hodgson’s steps provide a reasonable starting point for novice evidence-based library and information practice (EBLIP) researchers to follow.

Step 1 – According to Hodgson, the first step is understanding the problem, or formulating the question. I would adapt it even further and suggest being curious is the very first step. If we’re not curious, we can’t identify what we want to know, or the question we hope to answer. Perhaps a mindset of being open to the discomfort of not knowing motivates researchers to embark on the adventure of inquiry. Once my curiosity has been aroused, I move to formulating a question. Personally, my question remains somewhat fluid as I begin my research because there are times I really don’t have enough information to formulate an answerable question at the beginning of my research.

Step 2 – Collecting the facts, or as I prefer to call it, gathering the evidence, follows. This is one of the juicy, tingly, exciting parts of research. Once I have a question, I think about what information will answer it. Sometimes simply reading the literature will give me enough of an answer. At other times, I have to go further. Even just thinking about methods can send a shiver of excitement through me. Administering surveys, conducting interviews, or running reports from the ILS in the hope that they will illuminate some arcane or novel library user behavior are all ways of collecting juicy evidence; it is exciting to see initial results come in and begin to decipher what the results are actually saying. Sometimes the results are too skimpy, or inconclusive, and the evidence-gathering net needs to be cast again in a different spot for better results.

Step 3 – Hodgson suggests the next step should be developing a hypothesis to explain the facts you have gathered. This step, as much as or more than the others, requires our brain-work. Here we bring our prior knowledge to bear on the results and how they relate to the question. It is a time for acute critical thinking as we take the results of our evidence gathering and determine their meaning(s). Several possible meanings may arise at this stage. Hodgson implies it is important to remain open to the multiple meanings and to work to understand the evidence gathered in preparation for the next step.

Step 4 – In this step, Hodgson is especially Holmesian. He suggests it is now time to eliminate the weaker hypotheses in order to come closer to a solution. The focus on user-experience research is especially strong here: specific, actionable solutions are being sought to the question identified in the first step. He recommends evaluating your evidence and eliminating the weaker evidence in favor of the stronger. He is also cognizant of the need for solutions that are affordable and can be implemented in a given situation. While the whole of this step may not apply to all research, much of it will.

Step 5 – Implementation or action now has its turn. Again, Hodgson is speaking directly to the user-experience audience here. However, implementation or action based on research may lead to a decision not to implement or act upon a suggestion. The strength lies in the process around reaching this decision: questions were asked; evidence was gathered; analysis took place; judgment was applied. As Hodgson pointed out, this is a much better process than proceeding by intuition.

Finally, I would like to add a Step 6 to Hodgson’s list. In order to really know whether the action implemented had the desired effect, or an unintended one, it is important to evaluate the results of the action or change. In the effort to publish research results, timeliness is an issue, and it is rarely possible to have the luxury of the time it would take to measure an effect. Even in those cases, however, I am interested in what type of evaluation might take place at a later date. Sometimes researchers address their future evaluation plans; sometimes they don’t. Even if those plans aren’t being shared, I hope they are being considered.

This is a simple and elegant plan for research. In its simplicity, it glosses over many of the messy complications that arise when conducting research. That said, I hope this post encourages librarians to follow their curiosity down the path of research.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Open Data and EBLIP – How open are we?

by Pam Ryan
Director, Collections & Technology at Edmonton Public Library

When we talk about evidence based library and information practice (EBLIP), we’re most often talking about research – conducting our own or finding and integrating research and best-available evidence into our practice. Part of this continuum should also include working towards making our own library service and operations data openly available for analysis and re-use.

This isn’t out of line with current library initiatives. Academic libraries have long supported the open access movement, and for many, services around managing institutional research data are a current priority. Influenced by open government developments in their municipalities, public libraries are increasingly working to build open data literacy through programming, encouraging citizens to think critically about government services and to learn how to unlock the value of open data.

Why does open data matter for libraries? It aligns with our core values of access to information, sharing, openness, transparency, accountability, and stewardship. It supports our missions to provide information and data literacy, it can provide others with information to help us in our advocacy and value of libraries initiatives, and maybe most importantly, it can fuel research and initiatives we ourselves haven’t yet thought of.
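
As a rough illustration of that last point, here is a minimal sketch in Python of the kind of re-use an open dataset invites. The file name (checkouts_by_branch.csv) and its columns (branch, month, checkouts) are hypothetical placeholders rather than an actual EPL dataset; the point is simply that once service data is published in a machine-readable form, anyone can aggregate it and ask their own questions of it.

# Minimal sketch: summarize a hypothetical open dataset of monthly checkouts per branch.
# Assumes a CSV named "checkouts_by_branch.csv" with columns: branch, month, checkouts.
import csv
from collections import defaultdict

totals = defaultdict(int)

with open("checkouts_by_branch.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Add each monthly figure to that branch's running total.
        totals[row["branch"]] += int(row["checkouts"])

# Rank branches by total circulation, busiest first.
for branch, count in sorted(totals.items(), key=lambda item: item[1], reverse=True):
    print(f"{branch}: {count}")

The same few lines could just as easily feed a student project, a civic dashboard, or a comparison across library systems, which is exactly the kind of use we can’t predict in advance.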

My own place of work has a current business plan goal to “Develop an open data policy that includes how we will use and share our own data; participate in Edmonton’s Open Data community and support data literacy initiatives.” We’ve begun to make progress in these areas by developing a statement on open data and collaborating with the City of Edmonton on public programs:

• Edmonton Public Library’s Statement on Open Data

• EPL’s 2014 Open Data Day program

Has your library started a discussion about what your library’s approach to open data will be?

Further Reading:

Thompson, B. (2014). The open library and its enemies. Insights, 27(3), 229–232.

Data is Law / Civic Innovations: The Future is Open / Twitter: @pamryan

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.