Evaluating Information and Knowledge Services Using Narrative Techniques

An approach that engages stakeholders and encourages them to share their experiences and suggest improvements can complement traditional evaluation methods and provide a clearer indication of a library's overall impact.

By Nerida Hart and Mark Schenk, Information Outlook
In 2005, amid a turnover in leadership and the development of new strategic directions, rumors began circulating that Australia's Department of Families, Community Services and Indigenous Affairs (FaCSIA) was going to undergo a review to identify operating inefficiencies. Previous experience with such reviews suggested that the department's Knowledge & Information Services (K&IS) Section would likely be one of the first areas evaluated, as library and research services are often viewed as costs rather than assets.
The K&IS Section had invested more than eight years of effort in transforming the library into a more service-oriented function. The focus of the transformation was on identifying and meeting the information, knowledge and research requirements of clients and demonstrating the value of librarians as knowledge brokers. Client and staff satisfaction levels had skyrocketed during this period.
Now, the work of eight years might be undone by the stroke of an auditor's pen. Rather than wait for an external review to be imposed, the K&IS management team decided to evaluate the section's services to identify the value and benefits they provide.
This article describes the combination of methods used to conduct the evaluation and looks specifically at the narrative-based method, as it represents an innovative approach. The authors wish to emphasize that narrative-based approaches complement traditional evaluation approaches; they do not replace them.
Developing a Strategy
The K&IS Section provided library, information, knowledge and research services to FaCSIA and three other related government departments. The K&IS leadership team was concerned that traditional evaluation methods focusing on usage data and customer opinion would be of limited use in convincing the department's management to continue investing in library, information and knowledge services at their existing levels. Previous exposure to complexity concepts convinced the team that while their data could reveal inputs and outputs, alternative methods were required to get a clear indication of the overall impact of K&IS services.
Figure 1, known as the Cynefin framework, provides a useful model for describing how complexity thinking affects organizations (Snowden and Boone 2007) and how such thinking relates to evaluation. In a nutshell, many problems have aspects of each domain, and each domain requires a different approach.
[FIGURE 1 OMITTED]
The simple and complicated domains are said to be "ordered" views of the world. These domains help us understand inputs and outputs and provide answers to questions such as these:
* How much did it cost?
* How long did it take?
* How many stakeholders are affected?
* How satisfied are the stakeholders?
Contrasting with the ordered perspective is the "un-ordered" view, where there is no single correct answer and meaning emerges through the interaction of many entities. In the complex and chaos domains, control is an illusion, facts don't tell the full story, and different methods are required to make sense of events and circumstances. This is where narrative approaches come to the fore.
The K&IS leadership team had seen how a narrative-based approach, used in conjunction with more traditional methods, can provide an excellent vehicle for qualitative evaluation. They realized such an approach could shed more light on the value of the services provided to client agencies.
Ultimately, the leadership team designed an evaluation strategy that would provide evidence not only of the extent of usage of library and research services but also of the library's contribution to the overall productivity of client agencies. To meet these goals, three approaches were selected:
Quantitative analysis. Data were collected on a range of key inputs and outputs, such as the number of research requests, the number and type of requests made by each agency, the amount of time needed to fulfill requests, and the costs to use research databases. …