
11 - Evaluating your performance

from Part 2 - Skills and resources for evidence-based information practice


Introduction

The final stage of evidence-based practice is evaluating your own performance (Figure 11.1). This has two aspects: firstly, on a technical level, how have you performed with regard to the stages of evidence-based practice?

• Did I ask a specific, focused question?
• Did I efficiently find the best evidence to answer my question?
• Did I evaluate the evidence reliably according to its validity and usefulness?
• Did I apply the results of the research appropriately to a specific user or group of users?

Evaluating your performance in this context helps you become a better evidence-based practitioner.

Secondly, and more importantly, has the service that you introduced or modified as a result of undertaking the evidence-based process actually made the anticipated difference? As Todd (2002) identifies: ‘evidence-based practice thus has two important dimensions. First, it focuses on the conscientious, explicit and carefully chosen use of current best research evidence in making decisions about the performance of the day-by-day role. Second, evidence-based practice is where day-by-day professional work is directed towards demonstrating the tangible impact and outcomes of sound decision making and implementation of organizational goals and objectives’.

Evaluation will consider both direction and degree: did the intervention have the planned effect (as opposed to the opposite effect), and was the effect of the expected magnitude? It may also lead you to redefine the original problem (Figure 11.1).

Because change strategies involve organizational and individual factors, differences between what you anticipate and what actually happens can have various causes:

• differences between the political, cultural or economic environment in the published study and that in which you are operating (applicability differences)
• differences between the technologies employed in the published study and those available to you locally (intervention differences)
• differences between the morale, motivation and commitment of staff in the published study compared with those locally (motivational differences).