Reflections on the discussion period of the TransformUS task force reports

As the two of us reflect on the last eight weeks of conversation about TransformUS, a couple of points stand out.

The first is the wide interest in our community regarding the TransformUS process. Some 28,000 individuals appear to have viewed the web pages, watched the live streaming of town halls or attended meetings in person. A smaller number, 800, made individual, group or anonymous comments. The comments were diverse, and in many cases quite lengthy. Overall the tone was professional and constructive.

A second salient point is that those who submitted comments generally did so to express various concerns. Some fear the consequences for programs in which they work or study, or from which they have graduated. Some are upset about the possible impacts of an open and public prioritization process on reputations and attitudes. Others worry that an imperfect process or flawed data might determine critical decisions that affect us all. Behind these concerns, we perceive a deep attachment to the University of Saskatchewan – a sense of respect, love and ownership. The same concerns seem to have motivated those who wrote or spoke in support of the process and of the need to make choices.

We hear and understand these signals. We sense people’s deep attachment to our university and its historic programs and services. What we can do about the concerns is to prepare the best possible implementation recommendations – ones that are judicious and that best safeguard or enhance important programs and characteristics of our institution.

Following the release of the TransformUS task force reports to the campus community on 9 December 2013, President Busch-Vishniac invited any and all comments during the eight weeks that followed. A 48-page analytical and thematic summary of the commentary is now available.

The summary of feedback indicates a significant volume of response, with diverse modes and topics. The largest numbers of responses were those that criticized or contested the process; criticized or contested specific academic or support-service rankings; pointed to issues in interpretation of data; or drew attention to the degree of participation by students.

Given that the consultation period was about the task force reports, it is striking how relatively few comments engaged the overall reports, and particularly the themes identified by each task force in their respective executive summaries. We take it as a sign of the anxieties associated with an era of budget constraints that discussion focused on specific aspects of process and detailed recommendations. This was perhaps reflective of a desire to figure out what it would all really mean in practice for areas of the university.

We note that some of the feedback centred on matters we believe are speculative or largely unfounded. We see no meaningful signs that the reports are biased, though they do exhibit some patterns worth reflecting on (see the preliminary analysis of selected aspects of the reports). We do not agree that a budget process can be based on discipline-specific peer review, because in Canada budgets involve allocation among the programs within a single university. Again, we take these and other concerns as saying that people want to be sure that decisions honour and respect the actual accomplishments of programs and services.

We also note two issues, highlighted by the task forces and various discussants, that our academic councils might usefully take up: the uses of three-year degrees and the significance of small programs.

We accept the criticisms and suggestions about the data and process in the spirit that while such things can never be perfect in organizational decision-making, they can be better. All effective organizations know how to make decisions under conditions of incomplete information; and they also work to improve and share the information available.

Finally, we would like to repeat two observations we have made about the task force reports. First, they are the work of several dozen of our colleagues, who devoted hundreds of hours, and considerable thought and care, to produce the first-ever comprehensive, simultaneous comparison of all of our programs and services in terms of priority. The reports are unique in the history of our university and are sure, in our view, to be influential.

But second, the reports are not decisions, only stages in a process. Nothing will happen automatically or arbitrarily. Assisted by the feedback from the discussion phase, by follow-up discussions with college and unit leaders, and by targeted additional analysis, PCIP will now be working on recommendations that will go to governing bodies and decision-makers. The two of us are committed to doing this work with respect, sensitivity and professionalism that honour the cares and concerns that have been expressed by our community.

Brett and Greg

View the summary of feedback from the TransformUS feedback and consultation phase

9 thoughts on “Reflections on the discussion period of the TransformUS task force reports”

  1. Thank you for your question, Kevin. Comments made as part of the discussion at the January Council meeting were included within the themes outlined in the summary of TransformUS feedback. This document does not include all comments – it is an overview of themes we saw within the feedback, with selected comments included to support these themes. Verbatim quotes were used to illustrate this feedback where possible, and written quotes were favoured because it is much more difficult to capture comments verbatim at a public meeting. In addition, Council committees submitted feedback that is included in the February 2014 Council package and has been reposted on the TransformUS website at http://words.usask.ca/transformus/analysisrecommendations/data-analysis/.

  2. There’s something odd about this report.

    Paragraph 2 of section 3.3 says that “the analysis included an attempt to identify the overall frequency of a theme. Specifically, groups of quotes were used as exemplars for the themes that occurred most often while less common themes were presented as a list.”

    Section 4.1 provides that “Themes are presented in relation to those that occurred most to least frequently.”

    The comments excerpted in 4.2.1 are apportioned as follows (the number of exemplars appears in parentheses after each category):
    Task Force Critique (5)
    Size Bias (6)
    Template Critique (8)
    Dickeson Model Critique (7)
    Not Peer Review Process (5)
    Student Consultation and Participation (13)
    Deficit/Financial Stability (8)
    Morale (4)

    That works out to an average of exactly 7 exemplars per category (56 exemplars across 8 categories). (With great representation re: student consultation!)
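
    (For anyone who wants to verify the arithmetic, here is a minimal sketch in Python; it uses only the counts listed above and assumes nothing about the underlying comment data.)

    ```python
    # Exemplar counts per category in section 4.2.1, copied from the list above.
    counts = {
        "Task Force Critique": 5,
        "Size Bias": 6,
        "Template Critique": 8,
        "Dickeson Model Critique": 7,
        "Not Peer Review Process": 5,
        "Student Consultation and Participation": 13,
        "Deficit/Financial Stability": 8,
        "Morale": 4,
    }

    total = sum(counts.values())      # 56 exemplars in all
    average = total / len(counts)     # 56 / 8 = 7.0 exemplars per category
    print(f"{total} exemplars across {len(counts)} categories; average = {average:.1f}")
    ```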

    Now let’s have a look at section 4.2.2 (Endorsement of Process). To quote: “Although the majority of feedback focused on critiquing various aspects of the TransformUS process, there were also a handful of respondents who provided favorable comments and who felt it was a necessary process to carry out.”

    We then see 7 exemplars in this category, a number equal to the average for exemplars per category in 4.2.1. That seems high given the reference to only “a handful” of such replies.

    In order to have a full understanding of how the exemplars across these two sections manifest the principles laid out in sections 3.3 and 4.1, could we please be told how many comments were made in each of the categories represented here (and in the lists)? Were the 13 quotes about student consultation and participation selected from among 130 contributions? 1300? Were the 7 quotations endorsing the TransformUS process selected from among 70 such contributions? 700? What was the actual frequency of comments made in each of these categories?

    A breakdown of the data along these lines would contribute to the transparency of this process.

    • Curious, “RealTransformUS”: When reflecting on input and comments, the weight of a particular idea and its frequency are not always in direct proportion.

      • As per 3.3, the report attempted to identify the frequency of themes. That frequency was then not made clear in the report. Just an oversight, no doubt, but why not provide that data since the report invoked it in such a conspicuous way and in fact says pretty clearly that the report is organized around the frequency of themes? It’s the REPORT that invokes frequency, not me.

        One benefit of seeing that data would also be to provide a basis for thinking about which themes were more likely to be communicated via which media. That could be useful information in the future in terms of other such processes (in order to target the solicitation of opinion from various quarters).

        • RealTransformUS: The purpose of sharing the summary of feedback document was to provide a high-level overview of the types of feedback PCIP received and is considering. It is being shared with the campus community in the interest of transparency and as one piece PCIP is taking under consideration in the development of recommendations. Early in the analysis process we had hoped to provide some quantitative data with regard to the feedback received, but we soon realized that this would be nearly impossible, as the qualitative data we had to work with did not lend itself to such analysis. I would like to share an example to provide context. In some cases we know we had individuals commenting on the same topic through a number of different modes (at town halls, in tweets, by email and in letters to the Star Phoenix). In other cases we couldn’t identify who was providing the feedback and were unable to confirm whether the feedback was provided in duplicate. Therefore, as Professor Urquhart indicates, the frequency of a topic and its weight cannot be considered to be in direct proportion.

  3. This is a very interesting document, and it offers a fascinating snapshot of the many comments/questions on the program prioritization process from various directions.

    That said, conspicuous by its absence is any reference to remarks made in University Council. Given that this body will eventually vote on recommendations from the administration, this omission seems somewhat odd. Can you please explain how it is that no remarks from Council meetings made their way into the report?

    Thank you.
