Problem Solving = Great! But what kind of problems are our students really learning to solve?




What learning are we really asking our students to demonstrate, and what are we saying actually matters through our assessments?

Within statistics, exams ask students to apply statistical procedures such as t-tests to questions like "Is there a significant difference between boys and girls in self-confidence or neural activity when the mean is…?", where the criterion of significance is the standard one, the problem to solve is clear and familiar, the variables are provided, and even the values are given. Students just plug numbers into memorized equations. In contrast, what if I were to ask, on assignments (for practice) and on the exam, questions such as presenting a news story and asking students to outline the information and statistical analyses they would need in order to take a stance? They might then have to look up prior studies to find likely values, debate whether gender is a dichotomous category or a continuous variable for their purposes, determine how to operationalize the topic, set a 1/20 or more conservative cutoff for significance, and select and apply a statistical analysis. Which assessment would better measure the learning I want my students to carry forward? Which learning would you want that A+ to represent when you are deciding whether they will be your honours or graduate student?
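To make the contrast concrete, here is what the "plug into memorized equations" version looks like as a minimal sketch in Python (my choice of tool; the numbers are invented for illustration). Everything a student would otherwise have to decide — the data, the variables, the 0.05 cutoff — is already handed over:

```python
# Hypothetical illustration (all numbers invented): the "plug and chug"
# version of a two-sample t-test, where the data, the variables, and the
# alpha = 0.05 cutoff are all given to the student in advance.
from statistics import mean, variance

# Self-confidence scores, already collected and cleaned for the student.
boys = [3.1, 3.5, 2.9, 3.8, 3.2, 3.6]
girls = [3.9, 4.1, 3.7, 4.3, 3.8, 4.0]

n1, n2 = len(boys), len(girls)

# Pooled-variance t statistic, straight from the formula sheet.
sp2 = ((n1 - 1) * variance(boys) + (n2 - 1) * variance(girls)) / (n1 + n2 - 2)
t = (mean(boys) - mean(girls)) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5

print(round(t, 2))  # compare |t| to the tabled critical value at alpha = .05
```

The richer assessment described above starts several steps earlier: deciding what to measure, which values are plausible, and which analysis fits — none of which appears in this snippet.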
Problem-solving process

Several years have passed since I heard Dr. Eric Mazur speak of changing the activity in the classroom to engage students in learning and increase their conceptual understanding of physics. His peer instruction approach is widely shared; the video was included in an earlier blog post about participatory learning and transfer.

At the June 2014 opening plenary of the Society for Teaching and Learning in Higher Education conference in Kingston, Dr. Eric Mazur's pursuit of improving learning remained, but his focus had shifted:

“For 20 years I have been working on approach to teaching, never realizing that assessment was directing the learning … assessment is the silent killer of learning.”

As educators, we do not teach so that students simply learn the concept, lens, or procedure for tomorrow, but for the days and weeks that follow. If delaying an exam by one day disadvantages students, or if achieving a high grade cannot predict understanding of the fundamental concepts of force, he asked, have they really learned? If our assessments reflect and demand only a low level of learning, then our students will not learn at the higher levels we desire them to achieve. Do exams that promote cramming, or that could be answered with a quick Google search, really measure the kind of transfer or retention of information we should be aiming for?

Of the several changes Dr. Mazur outlined to improve assessment, the one that really gave me pause was his comment about what kinds of problems we are asking students to solve.

Think of the problems typically found in your field – the ones where the outcome is desired but the procedure and path to get there are not known (e.g., design a new mechanism, identify the properties of what is before them, or write a persuasive statement). In our assessments, by contrast, as Dr. Mazur pointed out, the problems students are asked to solve involve applying known procedures to a set of clearly outlined information to solve for an unknown outcome.

During the plenary, he presented a series of possible questions about estimating the number of piano tuners. The first version required students to make assumptions about frequency and populations; then, to reduce students' questions and uncertainty, the second version provided the assumptions, the third the name of the formula, and so on, until students were simply asked to remember the formula and input numbers, moving down the levels of Bloom's taxonomy from creating and evaluating to simple remembering.

Add in the removal of the resources I would reach for when running a statistical analysis or citing sources for a journal article, and the removal of the collaboration and consultation that my research enjoys but my teaching of research does not, and the gap between the exam and the reality I think I am preparing students for grows wider.

The question is how pre-defined and easily remembered or repeated the "information" is that students are being asked to identify, note as missing, and connect.

Resources

Video: Asking Good Questions, Humber College http://www.humber.ca/centreforteachingandlearning/instructional-strategies/teaching-methods/course-development-tools/blooms-taxonomy.html
Asking questions that foster problem solving, based on Bloom's Taxonomy

Bloom’s Taxonomy, University of Victoria
http://www.coun.uvic.ca/learning/exams/blooms-taxonomy.html
Lists example verbs and descriptions for each competence level

Bloom’s Taxonomy, www.bloomstaxonomy.org
http://www.bloomstaxonomy.org/Blooms%20Taxonomy%20questions.pdf
Question stems and example assignments
