Creating Time for Intellectual Creation: Deep Work and Maker Time




The familiar challenge:
We are six weeks into summer, and about midway down the pile on our desk sits that proposal, paper, or course redesign for which there has yet to be time.

Each week offers 40+ hours, yet we can barely string together 2 hours of continuous, focused work time. How can this be?

What’s going on:
We have time, but how we use it changes the quality of that time for worse or for better. Just as fractures weaken the structural integrity of a beam, or aesthetics transform an object into art, time’s productivity is transformed by our use of it.

Within computer science and programming there is a distinction between maker time and manager (meeting) time. The first involves chunks of time in which one can focus on conceptualizing and working through the depth of a design without surfacing to respond or shift to other topics. Managerial time, conversely, is broken up and additive. One more meeting in a day full of meetings does not have the same cost as a meeting in the middle of a block of maker time.

A pair of glasses focusing on the word "focus" in a dictionary.

Photo credit – Mark Hunter, CC-By

At a recent SoTL writing retreat, a faculty member commented that it was the first time they had read a full article uninterrupted. It can take a few hours to settle into the comfortable realization that the knock will not come and the (turned-off) email will not ding. Having by then also become reacquainted with the project, work starts to build, and more is accomplished in a day or two than in the weeks past.

This call to create space in our week (or at least in our summer) for focused work is the drumbeat of Cal Newport’s 2016 book Deep Work: Rules for Focused Success in a Distracted World. He posits that creating space and scheduling for deep intellectual work is the act of taking a chunk of time and maximizing its productivity. In a recent podcast, he speaks of both what is gained and how focus gets weakened by quick novel stimuli such as email, online browsing, and social media. It is a shift from talking and thinking about a project to conceptualizing and creating.

Synthesis, analysis, conceptualization, and creation (including writing) are higher levels of learning and thinking that require deeper focus, executive cognitive functioning, more active engagement, and reflection.

So what can I do now?
The GMCTL has four Deep Work opportunities that provide focused time, with expertise and facilitation as you need it.

  1. SoTL Writing Retreats (July 20, 21 & 22) with uninterrupted time and on-hand consultations for faculty, instructors, or staff working toward publishing a teaching and learning research project. Select a morning, come for all 3 days, or any combination in between. Register, and we will confirm your choice of days and times via email.
  2. Time for Course Design & Prep – In addition to the course design institute offered each year, we host drop-in mornings and afternoons. The next Consultations & Coffee Drop-In Morning is July 26, 9:30 am – 12:00 pm. Bring what you have so far, including any questions and ideas, and stay for time to work. *Registration is not required.
  3. Book “Deep Work” space for you, your course team, or SoTL project colleagues to meet. We can arrange a space and customize the level of facilitation and support you want, from a few minutes to more. Contact the GMCTL.
  4. Get unstuck – book a consultation to think through a course idea/challenge, a research design for a SoTL project, or an upcoming term. Contact GMCTL.

Over time, identify what works for you. Each person’s approach to Deep Work, like preferences in morning beverages, is unique, though with shared key ingredients and conditions to percolate or steep.

Teaching the Language of our Disciplines




Bolded words (those terms highlighted in textbooks) matter, for they are the building blocks of every language that allow us to communicate complex ideas, convey how we see the world, and shape our questions and ways of engaging with the world to answer them.

But words, those collected sets of sounds, do not form a language. The relationships (syntax) and the underlying meanings and ideas (semantics) are necessary for fluency.

We see this in students’ work where the keywords are there but applied incorrectly or erratically. They may start a sentence with one theorist’s premise and end it with another’s conclusion without noting the misalignment of essential principles. They apply one formula because a keyword appears in the word problem, without seeing that the context calls for a different approach.

Continuing the comparison to learning a language: sure, you can get me to memorize how to say “Hi, my name is…” or “Where is the grocery store?” in dozens of languages. And with memorization and practice, I could become quite good at stating those phrases. Maybe through experience or prior knowledge I would know generally when to say each phrase, though I might remain unaware of the informality of “hi” and the specificity of the store. Even with the recitation, I still would not know the language.

Even if I learned to repeat a thousand phrases, I still would not know the language. I would not grasp the differences in formality, tense, or nuanced connotations between good and passable. I would not know why “is the water hot” asks a question, “the water is hot” states a fact, and “the water could be hot” gives a tentative caution. Syntax and semantics (relationships and meaning) would be invisible, and correct application would be due to a mix of luck and surface-level knowledge.

Within a discipline, it is the ways we connect, distinguish, and extend ideas, facts, and bolded words that shape what it means to truly know that discipline. Deep or expert learning of a discipline requires knowing the meanings, conceptual frameworks, norms, relationships between pieces, and more, in addition to the piecemeal building blocks themselves.

When we teach, we need to teach beyond rote phrases, beyond the bolded words. We face the question: what does it mean to teach a discipline in which we are often first-language speakers, for whom it is simply intuitive?

Curious?

  • Read the key concepts & ways of knowing already identified in your discipline through existing literature on Decoding the Disciplines (http://decodingthedisciplines.org) or threshold concepts (listed by discipline: threshold concepts examples)
  • Experience the sense of getting stuck. Try learning another discipline yourself to see what is different, such as learning music or dance for the first time.
  • Seek out colleagues who have experience teaching another language or teaching non-majors.

Finally, take inspiration. Google has created software that can learn and create its own language. We humans have taught generations; let’s teach this next generation the language of our disciplines!

Teaching Students About Research: Open Data = Quality Data with Easy Access



When we teach students research skills and ways of approaching being a researcher, we know that research is more than just plugging in numbers or following a script.

In a statistical analysis, being able to select the variables to use (and not use) and the analysis that answers the question is as important as running the analysis.

We want students to design their own questions and analysis. The challenge though is where to get appropriate data easily and ethically?

At the U of S, we are in luck! Our librarians have identified several key Open Data sources:

Canadian Open Government Data
http://libguides.usask.ca/c.php?g=16466&p=91079
The site has 120,000 datasets from ten federal departments that are freely available for anyone to use.

  • Canada Open Data Pilot Project – “This pilot portal will make more than 260,000 datasets from the following ten participating departments available to all Canadians: Agriculture and Agri-Food Canada; Citizenship and Immigration Canada; Environment Canada; Department of Finance Canada; Fisheries and Oceans Canada; Library and Archives Canada; Natural Resources Canada; Statistics Canada; Transport Canada and the Treasury Board Secretariat.” (U of S library guide description)
  • 2011 Census of Canada Web Module
 – Released February 8, 2012
    http://www12.statcan.gc.ca/census-recensement/index-eng.cfm
    includes the Census of Population and the Census of Agriculture.
  • CANSIM – “Cansim is Statistics Canada’s key socioeconomic database of survey data. Updated daily. FREE as of February 1, 2012. License Information: This is an Open Access resource freely available on the Internet. Systematic copying or downloading of electronic resource content is not permitted by Canadian and International Copyright law.” (U of S library guide description)

United States Open Government Data
http://libguides.usask.ca/c.php?g=16472&p=91152

  • Data.gov
  • White House Open Government Initiative
  • NASA Open Data

These datasets are either exportable or offer web-portal access to aggregated data. Contact your librarian to learn more, and for Government data, contact Rob Alary at Data Library Services: robert.alary@usask.ca

Have a question about teaching research design, or an exciting way to use Open Data in your course? Connect with me at the GMCTE or carolyn.hoessler@usask.ca

(Thank you to Darlene Fichter, U of S Library, for providing feedback and up-to-date information)

Hands up! How We Increase (Or Decrease) Student Participation




We design courses with many opportunities for students to learn by completing assignments, doing readings, and answering questions in class. But does our teaching increase such behaviours or decrease them?

One lens, the psychology of learning, suggests we likely do both. B. F. Skinner’s operant conditioning suggests that how we respond to student behaviour can either increase (reinforce) or decrease (punish) our students’ actions, including participating in class discussion or completing homework.

What is Operant conditioning?

As Thorndike’s Law of Effect and B. F. Skinner’s operant conditioning note, we are influenced by the consequences of our actions. Good consequences encourage more of an activity, while unpleasant (or unhelpful) consequences encourage less of it.

Reinforcement increases the frequency of behaviours through either the addition of a pleasant stimulus (positive reinforcement) or the removal of an unpleasant stimulus (negative reinforcement).

Punishment decreases the frequency of behaviours through either the addition of an unpleasant stimulus (positive punishment) or the removal of a pleasant stimulus (negative punishment).

What about encouraging students to answer questions in class?

We might beneficially use punishment to decrease disruptive behaviours such as side conversations, interrupting classmates, or answering cell phones, by adding the unpleasantness of awkwardness when we stand nearby, interrupt to redirect conversation, or let silence fall during the phone call.

Our effect may also be neutral, leading to extinction, where the lack of reinforcement results in decreased responses. This can happen when an instructor neither confirms nor discounts the response and simply says “next” until they have three responses, regardless of correctness.

Over time, behaviours do not need to be (and should not be) actively reinforced every time to maintain higher participation or less skipping of class (see information on reinforcement schedules and fixed versus variable intervals and ratios).

Experiment!

Try seeing how the number of students’ answers increases (or decreases) with different responses. Predict via the lens of operant conditioning. For example:

  • What happens if I ask questions that are too easy? -> Students are likely not rewarded by answering.
  • What happens if I ask questions that are too hard? -> Students might not be able to answer and receive the explicit or implicit feedback that they are wrong.
  • What happens if I present my answer(s) on a slide after I ask them? -> Students might not be rewarded by answering.
  • But what if I skim the slide, pointing out all the parts they identified and building on their answers? -> Students might be rewarded and increase participation.
  • What if I summarize the readings? -> Students who read now have the frustration of listening again and having “wasted time,” while students who did not read are reinforced that their decision was correct.
  • What if I have them pull out the readings or use a specific page or section for an activity? -> Students who read are rewarded by not having to quickly skim, while students who did not read might experience uncertainty or struggle.

Applying operant conditioning is not about “coddling” or saying “good try” without correcting flawed knowledge, but about creating a learning experience that encourages participation, reading, and incorporating feedback into later performance. Even when a student’s answer is incorrect, there are ways to reward behaviours that lead to improvement (e.g., asking questions) and provide feedback to modify that knowledge by “rewarding” the correct bits, “punishing” incorrect parts, and, because we can speak better than pigeons, suggesting how to improve.

While it is useful to be cognizant of how our actions may encourage or discourage specific student behaviours, self-determination is still valued, and people may not want themselves or others treated like lab rats, as Sheldon does on The Big Bang Theory.


Why Google Can’t Replace Good Teaching




The internet contains more facts, pictures, and formulas than any human mind, yet we do not see it as “smart,” and it can sometimes feel like we are stumbling through a jungle. Last year’s estimate placed it at 136 billion pieces of 8×11 paper, and there are more pages now. In its amazing stack of human content, there are thousands of pages on each statistical test, recent political event, written work, and human experience. No shortage of information. But “knowing” requires more than access to, or repetition of, stacks of information.

What separates a novice from an expert is the richness of details, meaningful connections, identification of significant features that differentiate and inform, and the overall complex organizational structure that tells us the relative importance of a given fact, the other relevant factors, and thus which page to choose among the thousands. Just watch a student’s first literature search (or walk through the library stacks), or my online search yesterday about downspout options.

stacks

Picture via Flickr by Fly under a CC-BY license.

Guided development of expertise, with someone who “knows” describing the overall framework, highlighting key features, and revealing the decision-making process, is what is missing when we simply load a browser and hit search.

To be more to our students than the internet or any repository of facts could be, good teaching must model and build the expertise that shapes the essential ways of thinking and seeing the world at the core of our disciplines.

We, as experts teaching novices, face the challenge of making what comes as naturally to us as our usual mode of transportation explicit, tangible, and possible for our students, who are novices in our disciplines. How?

  • Model the process of decision-making and analysis as explicitly as you can. Practice and ask a colleague from outside your discipline (or one of us in the GMCTE) to listen and ask why.
  • Highlight the key distinguishing features, where you see them in the example, what they relate to, how you make sense of them, and what you now know and thus can decide or do.
  • Show the overall framework of the area with its many connections, and refer to both the framework and the connections across specific topics, facts, authors, images, formulas, approaches, time periods and more.
  • Draw on their own expertise and help link new material with existing connections by asking them to create their own maps, lists of steps, or mnemonics.

Resources on Novice & Expert

Academic Resource Centre, Duke University, Experts vs. Novices: What Students Struggle with Most in STEM Disciplines (summary) http://arc.duke.edu/documents/Experts%20vs%20Novices_STEMcourses.pdf

Ambrose, S., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. Chapter 2: How does the way students organize knowledge affect their learning? (pp. 40-65). San Francisco, CA: Jossey-Bass.

Bransford, J.D., Brown, A.L., & Cocking, R.R. (2004). How People Learn: Brain, Mind, Experience and School. Washington D.C.: National Academy of Sciences.

First Day of Class: Providing students a relevant and engaging initial taste




Sessions related to this topic will be held during the Fall Fortnight:

  • Why Teach With Top Hat? (Monday, August 22, 2016 from 10-10:25 AM) – Register Here
  • Building Student Capacity for Effective Group Work (Monday, August 22, 2016 from 1-4 PM) – Register Here 
  • Preparing & Personalizing Your Syllabus (Tuesday, August 23, 2016 from 1-2:50 PM) – Register here
  • Exploring Methods for Preventing & Detecting Plagiarism (Wednesday, August 24, 2016 from 10-11:30 AM) – Register Here
  • Attention & Memory: Increasing Students’ Learning (Friday, August 26, 2016 from 9-10 AM) – Register Here
  • Assignments, Rubrics, and Grading in Blackboard – It’s Easier Than You Think (Thursday, September 1, 2016 from 3-4:30 PM) – Register Here

As people, our perceptions of and routines for places and people are ingrained early on, our experiences and decisions shaped by earlier ones. Find yourself or your students returning to the same seat? Initially excellent (or poor) meals at a restaurant tinting later dining experiences? Buying a book when the first few sentences catch your attention?

The first day of class is similar: it sets the focus of the course; instills interest; sets the context including layout and participation levels; invites selecting (or dropping) a class; and builds initial credibility and approachability.

As a recent Chronicle Vitae post by Kevin Gannon (Director of the Center for Excellence in Teaching & Learning and Professor of History at Grand View University) noted, just reading the syllabus and ending class is not enough. You miss modeling how you expect students to engage in the classroom and miss the opportunity for intriguing questions and enthusiasm for the topic. As Gannon notes, “Whatever your plan for the first day, students should get some idea of what’s expected of them throughout the semester, and also have the opportunity to discern their place in the class and its activities.”

One example of engaging students in the content of the course, and in the level of participation and thinking you expect to see, is highlighting one of the essential ideas of the course in a way that is immediately relevant to students. Create that initial, individual, motivating connection to what they will learn.

Allow your students to glimpse the potential and be curious about where it will lead.

Highlighting a core focus of the course

1) Identify key principles, lessons or concepts that are foundational to your course

  • Resources: decoding the disciplines & threshold concept literature in nearly every discipline offers ideas to select from
  • Stats class example: ability to read descriptions of quantitative research to identify and critique reported and missing components of data description and analysis
  • Qualitative research class example: all forms of reporting involve speaking for the other person, including choosing what is conveyed and how.

2) Convey the hook of curiosity & why it is inherently meaningful to them.

  • Tip: Each discipline arose to explore and work on essential and important inquiry. As experts in the discipline, the importance is obvious but often hard to articulate to a novice.
  • Resources: Reflect on what makes this topic inherently so important and relevant. Not sure how to describe it? Try talking it out with a colleague outside your discipline (or one of us in the GMCTE).
  • Stats class example: a sample online news report of a relevant hot topic based on “research”
  • Qualitative research class example: meet and then introduce a classmate

3) Connect what students already know with what they will learn

  • Tip: Having a clear framework early in a course allows students to organize their new ways of thinking and new information. Referring to the example, diagram, and key ideas throughout the course reinforces them and helps to encode new memories.
  • Resource: The meaning step in 4MAT approach to lesson planning. Attention and memory literature. There is an upcoming fortnight session on Friday August 26th.
  • Stats class example: identifying the statistical and research pieces reported in (and missing from) the news story provided. The initial step for critiquing.
  • Qualitative research class example: experiencing the sense of responsibility and uncertainty when speaking for another person (especially when they are sitting beside you). Wondering did they say enough or too much? Were they accurate or misinterpreted? And then connecting it with the key idea of self & research within qualitative research methods.

Photograph courtesy of David Hetre under a CC-BY license.

Co-authoring Take 2: A co-authored post about co-authoring



Co-written with Shannon Lucky, Library Systems & Information Technology
Earlier this year I excitedly read Shannon Lucky’s April 21, 2015 post on co-authoring on Brain-Work, which sparked a chance to respond, connect, and collaborate. In our discussion about co-authoring we captured a wide range of questions to ask and strategies, which seemed to fit with the framework of the 5 Basic Elements of Cooperative Learning that we adapted to co-write this blog post. To see Shannon’s description of our collaboration visit http://words.usask.ca/ceblipblog/2015/08/27/co-authoring2/ where the following information is cross-posted on C-EBLIP.

– Carolyn

Co-authoring and collaborative research can be personally rewarding and can strengthen a project by tapping into multiple perspectives and disciplines. It can also be difficult and frustrating at times but conflicts can be minimized, or avoided altogether, through planning and clear communication.
The following checklist is based on the five basic elements of cooperative learning developed by Johnson, Johnson, and Holubec. Each element is defined and lists questions you should answer as a group and tips to keep in mind as your work progresses. These questions can feel uncomfortable or may lead to conflict, but it is better to have these hard conversations early and to sort out any impasses before it is too late. Sometimes collaborating with someone just doesn’t work, and it can be better to identify these situations early and walk away on good terms rather than have a project fall apart midway through, when lots of time, energy, and resources have already been invested.
Communicate early! Communicate often!

A good collaborative team needs:

1. Positive Interdependence – having mutual goals, pursuing mutual rewards, and needing each other to be successful.
Ask:

  • What are my goals for the project and what are my co-author’s goals?
    • This can include the number of publications you will write, the venue and format of publication, and timelines.
  • What am I bringing to this project and what are others in the group bringing?
    • Talk about your work style and preferences, personality, Myers-Briggs types, StrengthsFinders, what bugs you about working in groups – anything that will help your group get to know each other’s preferred work styles.
  • Can the project be easily divided so that everyone has a defined task?
    • Doing the literature review, editing, analyzing, referencing, etc.
  • What will each of our roles on the project team be and will they be static or rotating?
    • Note taking, coordinating meetings, synthesizing/pulling together ideas, etc.
  • What will the author order be or how else will author contribution be recognized?
    • How is this determined and is everyone in agreement?

Tips:

  • Know thyself – figure out what has bothered you about past collaborations and what has worked well. Communicate this clearly to your team members and ask them what works and does not work for them. Be honest and upfront about your expectations.

2. Face-to-Face Promotive Interaction – reading each other’s expressions or tone and having positive interactions
Ask:

  • How can we meet face-to-face, in the same room or using technology?
    • Especially important when working at a distance. We must interact with each other in ways that avoid misunderstandings or assumptions and that build consensus or respected distinctions.
  • How frequently should we meet and how will these meetings be arranged?
    • Are all meetings planned at the start of the project? Who is required at the meetings and who will organize and lead them? When will they occur?
  • What will our meetings look like?
    • Will they be for planning and checking in on individual progress, working meetings, or discussion and co-creation focused?
  • What is the length of our project?
    • Confirm what each collaborator is able and willing to commit to in advance. Situations can change, but having a rough expectation for required time and contribution to the group can help with contingency planning if need be.
  • How will we create a good rapport and welcoming environment for the group?
    • Whose job is it to set the tone? The meeting host and the content lead for the discussion don’t have to be the same person.

Tips:

  • Pay attention to discussions happening over email and other non-face-to-face interactions to ensure that positivity, respect, and encouragement are maintained.
  • Make sure everyone in the group is included in discussions so no one becomes isolated or siloed in their piece of the project. This recommendation does not preclude small task groups or subgroups, but communication should be forefront.

3. Individual Accountability – each person knowing what they need to do, being able to do it, and doing it on time.
Ask:

  • What are the deliverables?
  • What are realistic timelines for me? For my co-author(s)?
  • What are our external deadlines?
    • e.g., special issue deadlines, external reviewer, conferences, personal deadlines
  • What will we do if we fall behind or need to step back?
    • Anticipate setbacks and plan contingencies.

Tips:

  • Make individuals accountable to the group and their collective goals, rather than to a single leader. Allow the weight of several people relying on and expecting each piece to prompt action. This also reduces the leader’s need to track and chase contributions.
  • Make sure there is an explicit link between author order and contribution to the project or ensure another type of recognition for all authors.

4. Interpersonal And Small Group Skills – having the conflict-management, leadership, trust-building, and communication skills to build a well-functioning group
Ask:

  • What skills do we already have in our group for leadership, conflict-management, facilitation etc.? What gaps exist and how can we fill them?
    • This can mean adding a person or finding external support such as hiring a copyeditor.
  • What roles do we all want to play on this project?
    • Take care to consider each person’s workload and other projects they are involved with. You might not want to be the lead researcher or editor for multiple projects at the same time.
  • What is my bandwidth for contributing to this project?
    • Note if this is likely to change during the lifecycle of the project and how this will impact the group.

Tips:

  • See what skill development opportunities are available in your area.
  • Co-authoring might be an opportunity to either observe or practice a new skill.

5. Group Processing – continuing to be a well-functioning group, checking in regularly and using the skills from element #4.
Ask:

  • What points of coherence and dissonance have we identified as a group?
    • How do our personalities in element #1 work together or against each other?
    • How will we deal with disagreements?
    • What is the plan when individuals do not fulfill their part of the project?

Tips:

  • Revisit your roles and decisions periodically as a group.
  • Build time to reflect and discuss the project into your meetings or schedule time specifically for this activity.
  • Identify one next step or a change to improve your project and/or your work dynamic.
  • Celebrate your successes!

Sources:
Johnson, David W., Roger T. Johnson, and Edythe Johnson Holubec. Cooperation in the Classroom. Edina: Interaction Book, 1991. Print.

Evaluating Presentations With a Little Help From My (Citable) Friends …




Individual and group presentations provide a great opportunity for students to share what they have learned with peers, and an efficient and feasible way of marking for instructors.

That being said, how do you grade them?

I, and I’m pretty sure you too, have experienced the full range of presentations, from the stunningly excellent to the staggeringly confusing, from the inspirational to the sleep-inducing. The challenge is describing these qualities so they can be identified and assessed.

One option would be to create my own rubric based on these experiences.

The easier option is to use or adapt existing materials from others I respect.

The first source I turn to is the well-respected Association of American Colleges & Universities’ VALUE (Valid Assessment of Learning in Undergraduate Education) assessment initiative, which has created 16 rubrics including one for oral communication.

They define oral communication as “a prepared, purposeful presentation designed to increase knowledge, to foster understanding, or to promote change in the listeners’ attitudes, values, beliefs, or behaviors” and assess it according to five criteria: organization, language, delivery, supporting material and central message. The rubric describes requirements for each criterion across 4 levels. For example, Capstone (4) level delivery requires that presenters’ “Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation compelling, and speaker appears polished and confident.”

The second resource I consider is the more detailed (15 criteria across 3 categories) rubric of the American Evaluation Association’s Potent Presentation Initiative (p2i). Their website has every resource I ever wished to send to students (and perhaps others) about what a “good” presentation or poster looks like. They have posted rubrics, guidelines, templates and resources for regular slide presentations, Ignite presentations (20 slides × 15 seconds = 5 minutes of auto-advancing slides), posters and handouts.

In addition I could ask a colleague or see what other courses in the program are using.

presentation outline

One of the best parts of adapting rubrics is the opportunity to decide which pieces I find most important for my course (e.g., organization), ones that are relevant if revised to be more specific (e.g., supporting material) and ones that are not (e.g., mastery – speaking without reading from notes). I also can decide which resources I recommend (e.g., p2i slide design guidelines), which I comment on (e.g., I suggest noting times if you use the p2i rundown template) and which I just mention (e.g., p2i presentation preparation checklist).

When uncertain I can always ask for a second opinion from a colleague, request a consultation, or trial it before posting the criteria.

Happy assessing!

Picture courtesy of Sean MacEntee and carries a Creative Commons Attribution license.

A Lesson In “Not Impossible”




Each year about this time I start to imagine what if this year things were different. The possibilities of lively discussions, great feats of learning, and engaged students arise with excitement and then that doubting voice drifts in…

But what if it was not impossible?

What if the needs I perceived in my students and in my goals could be met?

Inspiration is offered by Mick Ebeling and the Not Impossible Labs through their work that began when they heard of a boy named Daniel, an individual with a particular need, and then asked the question “what if this is not impossible?”. After pulling together a team to think through and test possibilities, they found a solution that worked. The created solution not only gave Daniel an arm, it is changing a community – “help one, help many.”

Meet Daniel or hear Mick speak of creating an eyewriter for graffiti-artist TEMPT

What if what will help one will help many? Who is your Daniel?

My Daniel is Sara (pseudonym), the 2nd-year psych student I taught back in 2005 who wanted to learn statistics and needed to succeed in her honours program, but was afraid of math. I began developing explanations for Sara (e.g., is the difference between groups larger than the difference within groups?). The rationale and analogies allowed Sara to see the function of a t-test, know when to use it, and know what information is needed first and then what math to use to compute a t statistic. My approach became to link a statistical test’s purpose, function, and formula by pairing the mathematical formula on the board with a description of its purpose – specifically, why this test might be used. I then illustrate the functional patterns I would look for, noting the variables of the formula they represent.
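That between-versus-within comparison at the heart of the t-test can be sketched in a few lines. This is only an illustration of the idea, not course material; the group scores here are invented numbers:

```python
from statistics import mean, variance

# Hypothetical scores for two groups (invented, illustrative numbers only)
group_a = [4.0, 5.0, 6.0, 5.5, 4.5]
group_b = [7.0, 8.0, 6.5, 7.5, 8.5]

def t_statistic(a, b):
    """Independent-samples t: the difference BETWEEN group means,
    scaled by the spread WITHIN the groups."""
    n_a, n_b = len(a), len(b)
    # Pooled (within-group) variance across both samples
    pooled = ((n_a - 1) * variance(a) + (n_b - 1) * variance(b)) / (n_a + n_b - 2)
    between = mean(a) - mean(b)                      # difference between groups
    within = (pooled * (1 / n_a + 1 / n_b)) ** 0.5  # within-group variability
    return between / within

print(round(t_statistic(group_a, group_b), 2))  # prints -5.0
```

A large (absolute) t arises exactly when the between-group difference dwarfs the within-group spread – the same question posed in words above.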

For example, I would show the formula for a Pearson correlation and describe its purpose of determining to what extent two variables are related. The Pearson correlation functions by detecting co-occurrence, such as when survey respondents answer two questions similarly. If variable A is rated highly, is variable B? The analogy I use is pair dancing – if the partners are in sync, they get higher ratings (closer to 100, or in this case 1.0). The math reflects this purpose and function in the numbers it uses and generates.
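The "dancing in sync" idea maps directly onto the arithmetic. As a minimal sketch (the two rating lists are hypothetical), Pearson r is just the covariance of the two variables divided by the product of their standard deviations:

```python
from statistics import mean, stdev

# Hypothetical ratings of two survey questions (invented numbers)
question_a = [1, 2, 3, 4, 5]
question_b = [2, 3, 4, 4, 5]

def pearson_r(x, y):
    """Pearson r: how closely two variables move 'in sync',
    on a scale from -1 to 1."""
    mx, my = mean(x), mean(y)
    # Covariance: large when x and y rise and fall together
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

print(round(pearson_r(question_a, question_b), 2))  # prints 0.97
```

When the two lists rise and fall together, the products in the covariance are consistently positive and r approaches 1.0 – the dancers are in sync.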

The resulting purpose-function-math cognitive scaffold not only helped Sara face and complete assignments, it helped her classmates compare statistical tests and better understand when to select them. Since then, I have continued to develop materials and teach sessions locally, nationally, and internationally on statistics for individuals with little or no background in math and stats. The approach that was designed to meet Sara’s needs now benefits many.

A key part of the Not Impossible Labs’ success is identifying a collection of others who have the knowledge or skills needed to make the impossible not impossible. Mick needed knowledge of prosthetics and access to equipment. I talked with individuals who disliked and struggled with statistics, as well as those who used statistics daily. What do you need? And who among our community of excellent educators, librarians, course (re)designers, curriculum discussion facilitators, mentors, students, and the wider community, will be on your team?

I would love to learn of your experiences of Not Impossible and wrestling with the “seems impossible”!

Problem Solving = Great! But what kind of problems are our students really learning?




What learning are we really asking our students to demonstrate, and what are we saying actually matters through our assessments?

Within statistics, exams require students to apply a statistical procedure such as a t-test to questions like “is there a significant difference between boys and girls on self-confidence or neural activity when the mean is…” – where the criterion for significance is the typical one, the problem to solve is clear and familiar, the variables are provided, and even the values are given. Students just plug into memorized equations. In contrast, what if I were to ask, on assignments (for practice) and on the exam, questions such as presenting a news story and asking students to outline the information and statistical analyses they would need in order to take a stance? They might then have to look up prior studies to find likely values, debate whether gender is a dichotomous category or a continuous variable for their purposes, determine how to operationalize the topic, set a 1/20 or more conservative cut-off for significance, and select and apply a statistical analysis. Which assessment would better measure the learning I want my students to carry forward? Which learning would you want that A+ to represent when you are deciding if they will be your honours or graduate student?
Problem solving process

Several years have passed since I heard Dr. Eric Mazur speak of changing the activity in the classroom to engage students in learning and increase their conceptual understanding of physics. His approach of peer instruction is widely shared (the video was included in an earlier blog post about participatory learning and transfer).

In the June 2014 opening plenary of the Society for Teaching and Learning in Higher Education conference in Kingston, Dr. Eric Mazur’s pursuit of improving learning remained, but his focus had shifted:

“For 20 years I have been working on approach to teaching, never realizing that assessment was directing the learning … assessment is the silent killer of learning.”

As educators, we do not teach so that students simply learn the concept, lens, or procedure for tomorrow, but for the days and weeks that follow. If delaying an exam by one day disadvantages students, or if achieving a high grade cannot predict understanding of the fundamental concepts of force, he asked, have they really learned? If our assessments only reflect and demand a low level of learning, then our students will not learn at the higher levels we desire them to achieve. Do exams that promote cramming, or that could be answered with a quick Google search, really measure the kind of transfer or retention of information we should be aiming for?

Of the several changes Dr. Mazur outlined to improve assessment, the one that really caused me to pause was his comment about what kinds of problems we are asking students to solve.

Think of the problems typically found in your field – the ones where the outcome is desired but the procedure and path to get there are not known (e.g., design a new mechanism, identify the properties of what is before you, or write a persuasive statement). In our assessments, as Dr. Mazur contrasted, the problems students are asked to solve instead involve applying known procedures to a set of clearly outlined information to solve for an unknown outcome.

During the plenary, he presented a series of possible versions of a question asking students to estimate the number of piano tuners: the first version required students to make their own assumptions about frequencies and populations; then, to reduce students’ questions and uncertainty, the second version provided the assumptions, the third the name of the formula, and so on, until students were simply asked to remember the formula and input numbers…moving down the levels of Bloom’s taxonomy from creating and evaluating to simple remembering.
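Stripped to its arithmetic, the fully open version of that estimate is only a few lines once the assumptions are made explicit – and making those assumptions is precisely the part the scaffolded versions take away. Every number below is an invented assumption, offered only to show the shape of the reasoning:

```python
# A rough sketch of a piano-tuner Fermi estimate.
# All values are made-up assumptions; the point is the reasoning chain.
population = 2_000_000            # assumed city population
people_per_household = 2          # assumed household size
households_with_piano = 1 / 20    # assumed fraction owning a piano
tunings_per_piano_per_year = 1    # assumed tuning frequency
tunings_per_tuner_per_year = 2 * 5 * 50   # 2 a day, 5 days, 50 weeks (assumed)

pianos = population / people_per_household * households_with_piano
tuners = pianos * tunings_per_piano_per_year / tunings_per_tuner_per_year
print(round(tuners))  # prints 100 – an order-of-magnitude estimate
```

The open question demands that students create and evaluate each line of this chain; the most scaffolded version asks only for the final division.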

Add in the removal of the resources I would reach for when running a statistical analysis or citing sources for a journal article, and the removal of the collaboration and consultation that my research enjoys but my teaching of research does not, and the gap between the exam and the reality I think I am preparing students for grows wider still.

The question is how pre-defined and easily remembered or repeated the “information” is that students are being asked to identify, note as missing, and connect.

Resources

Video: Asking Good questions, Humber College http://www.humber.ca/centreforteachingandlearning/instructional-strategies/teaching-methods/course-development-tools/blooms-taxonomy.html
Asking questions that foster problem solving based on Bloom Taxonomy

Bloom’s Taxonomy, University of Victoria
http://www.coun.uvic.ca/learning/exams/blooms-taxonomy.html
Lists example verbs and descriptions for each competence level

Bloom’s taxonomy, www.bloomstaxonomy.org
http://www.bloomstaxonomy.org/Blooms%20Taxonomy%20questions.pdf
Question stems and example assignments
