Bridging the Gulf: Mixed Methods and Library Service Evaluation

This paper explores library evaluation in Australia and proposes a return to research fundamentals in which evaluators are asked to consider the centrality of philosophical issues and the role of different research methods. A critique of current evaluation examples demonstrates a system-centred, quantitative, input/output focus which fails to engage with users' experiences or to address service objectives, and provides insufficient information upon which to make decisions about service improvements. This reliance on traditional evaluation is juxtaposed with the emphasis library and information studies places on alternative methodologies. The paper asks why a perspective that has become mainstream in academic research remains peripheral to practice-based evaluation.


What is evaluation?

Evaluation here is concerned with efficacy and the extent to which objectives have been met within the provision of social amenities (Greene, 2000), most often using research methods to question people about a service they have recently experienced (Pawson & Tilley, 1997). Although some consider evaluation to be the third and final step in a linear model: needs assessment > planning > evaluation (Witkin, 1994), others regard it as a part of each step, seeing the development of services as a more cyclical, reflective process in which evaluation is combined with ongoing needs assessment and planning.

Hernon distinguishes between the terms 'assessment' and 'evaluation', explaining that assessment is the process of gathering data, while evaluation is the final stage in which the data is interpreted and 'value judgments' are made (2001, pp. 94-95). This paper will argue that value judgments are made throughout the entire process of needs assessment, planning, service provision, assessment and analysis, and that these judgments strongly affect the nature of the service as well as the means of evaluating it. Therefore the term 'evaluation' will be used broadly, with the implication that all aspects are underpinned by a set of assumptions and corresponding values--often at an unconscious level.

Evaluation methodology has attracted considerable attention from a number of disciplines (particularly psychology, social policy, sociology and education) and has been shown to be a highly flexible form of applied research. Depending on the approach used, evaluation can be either narrow or broad in focus, formulaic or dynamic, imposed or participative, purist or eclectic. Although many papers in library and information studies (LIS) recognise these issues and use evaluation methods to address program, service or system provision, there is a lack of discussion about evaluation itself as an information behaviour. Furthermore, the ideas that dominate LIS academic debate and research are not evident in the research that libraries conduct themselves. For the most part it seems that practitioners conduct 'evaluation', often contracting out the whole or parts to market research companies, while academics conduct 'research'. There appears to be a gulf between the two practices and the ideas which inform them, despite the huge number of evaluations which are continually taking place within the very communities LIS research aims to improve. This is a deeply problematic situation in a discipline which is committed to 'excellence in professional service to our communities' (ALIA, 2002, point 6).

Why evaluate library services?


The principles which inform our discipline require us to engage in on-going service improvement in order to achieve the excellence which ALIA advocates and the 'high standards of provision and delivery of library and information services' outlined by the International Federation of Library Associations & Institutions (IFLA 2003a, section: Aims). Evaluation assists decision-making by providing information about service strengths and weaknesses, indicating where successes can be built on and areas where significant improvements could or should be made, highlighting service gaps and suggesting new directions. …