Principle: Ambiguity in interpreting outcomes or performance measures/indicators of regulatory intervention when also seeking to prevent breaches (Also known as the Chameleon Regulatory Intervention Indicator Principle)

The number of regulatory interventions is often used as an outcome or performance measure/indicator for organizations. This usually occurs in the public sector, but it can also occur in private sector settings where the behavior of individuals is being regulated – for instance, HR issues within a large corporation. Where regulatory interventions are used in this way, there can be ambiguity in interpreting an increase or decrease in the number of interventions an organization or department undertakes.

An increase can be interpreted as either: 1) showing that the organization is doing its job well (hence achieving its outcomes); or 2) showing that the organization is failing to do its job, because the behavior it is trying to regulate is increasing. Conversely, a decrease in regulatory interventions can be interpreted as either: 1) showing that the organization is doing its job well (by using other means to prevent the bad behavior being regulated – i.e. ‘the regulatory intervention is only a last resort, we are focused on preventing it having to be used’); or 2) showing that the organization is failing to achieve its mandate of effectively regulating the behavior. This ambiguity increases to the degree that the organization or department includes within its outcomes the ‘prevention’ of the behavior it is regulating, in addition to simply making regulatory interventions.
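To make the symmetry of these face-value readings concrete, here is a minimal sketch (in Python, for illustration only; the reading strings paraphrase the paragraph above and are not a formal part of outcomes theory):

```python
# Illustrative sketch only: each direction of change in the raw
# intervention count maps to two opposing, equally plausible readings.

FACE_VALUE_READINGS = {
    "increase": [
        "doing its job well (actively enforcing, hence achieving outcomes)",
        "failing (the behavior it is trying to regulate is increasing)",
    ],
    "decrease": [
        "doing its job well (prevention working; intervention a last resort)",
        "failing (not pursuing its regulatory mandate)",
    ],
}

def interpret(direction: str) -> list[str]:
    """A raw intervention count supports both readings at once."""
    return FACE_VALUE_READINGS[direction]

print(interpret("decrease"))  # both readings come back; the count alone cannot decide
```

Whichever direction the indicator moves, both readings come back, which is exactly why the raw count cannot be interpreted on its own.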

Because of this ambiguity principle, changes in such indicators are not interpretable when presented on their own. To interpret them, they need to be put in the context of a visual outcomes model showing the steps in the intervention, the other indicators being measured, and the evaluation questions being answered in regard to the outcomes model.
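One way to picture what ‘in the context of the outcomes model’ means is to attach indicators and evaluation questions to the model’s steps. The sketch below is a rough stand-in, assuming a dict-based representation and hypothetical step names loosely based on the conservation example discussed under Examples; a real DoView is a drawn diagram, not code:

```python
# A minimal sketch, assuming a dict-based stand-in for a visual outcomes
# model. Step names are hypothetical. The structure just shows indicators
# and evaluation questions attached to steps, so that any one indicator
# is read in context rather than in isolation.

outcomes_model = {
    "engage parties before potential hearings": {
        "indicators": ["number of pre-hearing engagements"],
        "evaluation_questions": [],
    },
    "contentious issues resolved without a hearing": {
        "indicators": [],
        "evaluation_questions": [
            "did pre-hearing engagement reduce the number of "
            "contentious issues going to hearings?"
        ],
    },
    "make legal representations at hearings": {
        "indicators": ["number of regulatory interventions"],
        "evaluation_questions": [],
    },
}

# Interpreting a change in any single indicator means looking across
# the whole model, not at the indicator alone.
for step, attached in outcomes_model.items():
    print(step, "->", attached)
```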

Examples

The Courts’ System is an example where an immediate (as opposed to a long-term) preventive outcome is not usually seen as being within the Courts’ mandate. The above principle therefore suggests that, other things being equal, there is likely to be little ambiguity in interpreting regulatory interventions going up or down in the case of the Courts.

In contrast, a national department of conservation reported a reduction in the number of times it involved itself in a regulatory process – in this instance, the number of times it made legal representations at Conservation Resource Management Consent Hearings. At the time, the department was in the media spotlight over staff and budget cuts.

A media interviewer, interpreting the reduction in regulatory interventions as a failure of the department to effectively pursue one of its outcomes, asked the department’s head: ‘…on the face of it, is it a lower priority?’ [the regulatory intervention being asked about was the department getting involved in Conservation Resource Management Consent Hearings]. The department’s chief, interpreting the drop in the measure in the opposite way, replied:

‘What you are falling into is the trap of judging and measuring our success by the number of cases we take regardless of the outcome. We see [Conservation Resource Management Hearings] [the regulatory intervention] as a last resort. We would rather sit down without spending money on lawyers and work out issues if we can and confine the [Conservation Resource Management Hearings] issues to ones that we really can’t reach agreement on’.*

Both sides in this argument are making reasonable ‘face value’ interpretations of the change in the indicator. To actually interpret what is going on with this indicator, further information would be required – for instance, whether there has been an increase in departmental activity focused on getting the parties together prior to potential Resource Management Consent Hearings. The outcomes theory approach to making Chameleon Regulatory Intervention Indicators interpretable is to insist that they are always discussed against a visual outcomes model (e.g. a DoView). When viewed as in the DoView below, a clearer picture emerges of what may be going on.

[DoView outcomes model diagram: the regulatory intervention indicator shown in black, the complementary pre-hearing engagement indicator shown in red, and the related evaluation question mapped onto the model’s steps]

Information about the indicator in red would be required in order to interpret the regulatory intervention indicator in black. Even then, it could not be absolutely determined just from the indicator in red that the department had been successful in reducing the number of contentious issues going to hearings (which is the desired flow within this outcomes model). One would also need to answer the evaluation question which appears in the outcomes model.
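The interpretive logic just described can be sketched as follows. Both parameter names are hypothetical stand-ins for the black and red indicators in the DoView, and even the favorable reading is flagged as provisional, since only answering the evaluation question can confirm it:

```python
# Illustrative sketch only. 'interventions_change' stands in for the black
# indicator (change in consent-hearing representations) and
# 'engagement_change' for the red indicator (change in pre-hearing
# engagement activity); neither name comes from the article.

def plausible_readings(interventions_change: float,
                       engagement_change: float) -> list[str]:
    """Use the complementary (red) indicator to narrow the face-value
    readings of a drop in the (black) intervention indicator."""
    readings = []
    if interventions_change < 0:
        if engagement_change > 0:
            # Consistent with the desired flow in the outcomes model:
            # issues being worked out before they reach a hearing.
            readings.append("prevention may be working - still subject to "
                            "the evaluation question in the model")
        else:
            # No rise in upstream activity, so the 'failing to pursue
            # its mandate' reading remains live.
            readings.append("department may simply be regulating less")
    return readings

print(plausible_readings(interventions_change=-0.2, engagement_change=0.3))
```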

* ‘Can DoC Achieve its Core Role on its Current Funding and After its Third Round of Restructuring?’ Radio New Zealand, Nine to Noon, 27 March 2013, 9:29am. http://radionz.co.nz

First posted: 27 March 2013

