Book contents
- Frontmatter
- Contents
- List of figures
- List of tables
- Preface
- Acronyms and abbreviations
- 1 Background
- 2 Theoretical considerations
- 3 User satisfaction
- 4 Impact on users
- 5 Social and economic impact
- 6 Inputs
- 7 Processes
- 8 Outputs
- 9 Staff
- 10 Infrastructure
- 11 Services for all
- 12 Benchmarking
- 13 The balanced scorecard
- 14 Standards
- Appendix 1 Data collection methods
- Appendix 2 The analysis of data
- Appendix 3 The presentation of results
- Index
■ Introduction
Ultimately, libraries and information services are judged by the effects of what they do. If they can demonstrate that people have become more knowledgeable or have developed new or improved skills from contact with the service, then the message to their constituencies and their funders is positive – and likely to be reinforced by the resources needed to sustain and develop those services. If there is no evidence available of any beneficial effects, then they are vulnerable to questions over their purpose and the value of maintaining them. It is not surprising that questions of impact therefore loom large. In this chapter the focus is on qualitative methods of assessing impact; in the next chapter the focus shifts to the question of economic impact.
It is worth noting again that terminology in this area is somewhat fluid. In particular, the term *outcomes* is sometimes used, especially in the USA, in place of *impact*. This explains why many US libraries and programmes concern themselves with outcome-based evaluation (OBE) – see, for example, the web pages of the Institute of Museum and Library Services (IMLS) at www.imls.gov/applicants/obe.shtm. Even in the UK there is ambiguity. For example, the Inspiring Learning for All website (www.inspiringlearningforall.gov.uk/), designed to support public libraries in their learning activities, talks of ‘Tools to assess the impact of learning’ on its home page, providing a link to a page which doesn't mention ‘impact’ at all but instead addresses the question ‘What are learning outcomes?’! One of the reasons for this is that in education it is usual to talk of *outcomes* rather than *impact*: so, for example, when the Department for Education and Skills commissioned an evaluation of the University for Industry (UfI), the resulting report was entitled *Tracking Learning Outcomes: evaluation of the impact of the UfI*. In this book the convention of treating ‘outcomes’ as the more immediate effects and ‘impacts’ as those of the longer term is observed.
Impact assessment has become particularly important since governments introduced much more explicit policy agendas for libraries and their parent bodies. For example, throughout the world there has been high-level policy concentration on matters such as lifelong learning and digital inclusion. It is not surprising, therefore, that individual libraries are called on to demonstrate that they are contributing to the achievement of strategic goals in these areas.
- Type: Chapter
- Information: *Measuring Library Performance: principles and techniques*, pp. 54–76. Publisher: Facet. Print publication year: 2006.