In June 2024, the Board released its Analysis of Scopes of Practice Consultation Feedback.

At the outset, it is noteworthy that, in comparison to many public consultations, the responses reported in this analysis are overwhelmingly contrary to the proposals. Additionally, despite a strong response rate, the author notes that many respondents indicated being unable to answer some questions because of the way they were framed, and many more simply skipped multiple questions.

Authorship

The Board indicated this was an independent analysis but, contrary to convention in psychology, no authorship was recorded on the document. At initial review, without knowing the authorship, or the authors' qualifications and expertise in qualitative analysis of a data set of this size, specificity and complexity, it is difficult to have confidence in how well this document summarised the concerns and perspectives of the 1349 survey respondents and of the many stakeholders represented in the 53 email submissions.

In response to a later inquiry made by colleagues, the Board did confirm that the analysis was completed by a (named) accountancy firm. That confirmation provides context for the absence of a scholarly statement of methodology and for other aspects of the data reporting that diverge from discipline norms.

Executive Summary

  • Unclear what is meant by “The response rate to individual survey questions tended to be (italics added) between 59 and 65%.” Does this mean the response rate to individual survey questions varied between these percentages? Were some questions answered by almost all respondents while others had very low response rates? If so, which questions? Given that open fields were provided, did respondents provide any clues to explain non-response?
  • Notes analysis of both email submissions and survey responses, and that the majority of themes were common across both, but does not explain the rationale for reporting these data sources separately.
  • No indication of the types of stakeholder groups represented by email responses or the numbers of members/employees/clients they may have represented. For example, the analysis does not refer to the submissions on behalf of employers, unions, professional groups or other collectives. Thus a response from one individual may have been treated as equivalent to a response representing several, or several hundred, individuals.

The executive summary fails to clearly identify the key findings, which might more accurately have been summarised as:

  • Consultation process – many respondents expressed dissatisfaction, citing ineffective communication and lack of collaboration, as well as a desire to have more involvement.
  • Need for change – while some respondents indicated appreciation of the Board’s efforts, the overall sentiment was clearly negative; specifically, there were clear concerns that the need for change had not been established, nor had explanation and justification of the proposed changes been provided.
  • Proposed Scope Structure – overwhelmingly negative feedback, with themes including that the proposal was overly restrictive, did not reflect the realities of psychological practice, and would result in reduced access to care, increased administrative burden and an increase in referrals.
  • Kaupapa Māori Scope – while there was agreement on the importance of culturally competent practice, perspectives were mixed as to whether a separate scope would improve this or result in a decrease of availability of services to Māori.
  • Impact on the Public – in addition to reduced access and increased referrals, there were concerns that the restrictive nature of the proposed scopes would reduce the availability of psychological services.
  • Tertiary Education – concerns that the proposed scope structure would place additional strain on university training programmes, increase costs, limit the ability to train new psychologists and negatively impact universities’ capacity to contribute to the ongoing development of the profession.

Introduction

p. 6. Asserts that “Many psychologists in good faith have developed and expanded their practices beyond the scope of practice in which they are registered”. It is surprising to see this assertion in an analysis of consultation feedback. If the statement were proven, it would mean that those many psychologists have, knowingly or not, been practising illegally, and presumably doing so with the Board’s knowledge. Yet no evidence is provided for this assertion. If the author is quoting that assertion from a Board document, they have not signalled that (and quoting it without attribution would not be appropriate in any event).

Method Overview

It is important to note that the Board had not indicated in advance the methodology that would be used to analyse the results. In psychology, however, reporting on qualitative data typically captures both the range of themes in response content and the volume of agreement with each. There are clear distinctions among what can be concluded from a question which elicits: responses spread over a wide range of themes with little alignment; a high level of agreement over a small range of themes; or even a bimodal pattern of wide-ranging responses alongside significant agreement around a particular theme.

However, this section is not of the standard expected in psychology. It is clear that the data have not been analysed by any formal qualitative method that would be recognised in psychology: the report presents neither a formal thematic analysis (which is subject to analyst bias) nor even a formal content analysis. It does not explain how the data were coded nor how themes were derived.

Essentially, the sections of the report addressing email and survey responses do not appear to present qualitative analysis so much as the author’s impressions, illustrated with cherry-picked quotes. Although the inclusion of numerous quotes gives the reader a sense that the analytic commentary is an accurate synthesis, given the lack of methodological framework, the reader has no assurance that this is so.

Presumably many of the email submissions represented groups. However, there is no indication of the numbers represented by these submissions. For example, in addition to the submissions on behalf of the New Zealand College of Clinical Psychologists and the New Zealand Psychological Society, collegial sharing of submission documents indicates that several submissions were prepared collaboratively by committees on behalf of constituent groups (to address their specific priority concerns). These groups varied in size, with one numbering over 550 members. Together these submissions represent a large proportion of the entire profession. None of them supported the proposed scopes framework, and all were critical of the process.

The original consultation document and survey are neither appended to the report nor linked in the Method. In most fields of psychology this would be expected, so that readers have ready access to the original documents should they wish to check the author’s interpretation of responses against how a specific issue was framed.

Limitations

Evidently, respondents took the time to offer many specific suggestions. Unfortunately, the author notes that, on the basis of their “large volume and specificity”, these were generally not included in the analysis.

Finding some exact phrases repeated across the dataset, the author characterises some responses as “repeated or collusive”. The use of such a pejorative term appears unprofessional but, more importantly, it indicates a lack of understanding of how the repetition may have come about. In order to understand and think through the consequences of the proposals, many groups of practitioners discussed the consultation document together. Collaboration is, after all, a core competency, and as practitioners have a responsibility to consider how any proposal might affect the client group/s they serve, it seems reasonable that they would work together; where there was a high degree of agreement, repetition of phrases across the dataset would indeed be expected. As some respondents evidently pointed out, many found the proposal unclear and/or insufficiently evidenced, and the survey poorly constructed, making it difficult to frame meaningful responses. Those conditions made responding a very time-consuming undertaking, which may also have contributed to time-poor practitioners pooling their efforts.

Conclusion

Overall, the conclusion does not follow logically from the body of the report.

In fact, the conclusion itself appears biased. For example, the equivocal phrase “tended to be negative” is simply inaccurate: even among the necessarily cherry-picked comments included, fewer than 10% could be characterised as positively supportive of the proposals. Referring to the potential for negativity bias in surveys is disingenuous when the analysts have provided no information about the numbers of respondents represented in the various email responses, nor any analysis of the numbers of individual respondents endorsing particular themes. For example, Section 3.2.14.1 might represent only 1, 2 or 3 practitioners, and Section 3.2.11.2 might represent 5 or 500 practitioners.

The author then claims to have distilled the themes into three points, which conveniently leave aside many key issues reported on the preceding pages.

A more accurate conclusion might have indicated that feedback on the proposal of a Kaupapa Māori scope was mixed, but that otherwise the feedback represented predominantly negative sentiment, with key concerns pertaining to:

  • a clear call for evidence of the alleged problem, justification for the proposed changes, and evidence that they would effectively solve the problem
  • the appropriateness and restrictiveness of the proposed scopes
  • the consultation process: not collaborative and lacking effective communication, including a lack of clarity about elements of the proposal
  • the impact on the public (decreased access and clarity) and on professionals (additional strain on the workforce and division in the profession)
  • strain on tertiary education, affecting both the ability to train new professionals and the capacity to contribute to the development of the profession.