Background. The Northern Alliance Hospital Admission Risk Program-Chronic Disease Management comprises 13 services delivering care to people with chronic disease and to older people with complex care needs who are frequent hospital users.
Aims. To develop and implement a system-wide approach to the evaluation of this existing program.
Methods. The Northern Clinical Research Centre audited all existing, routinely collected administrative data within the program and then met with each service to develop service-specific outcome measures. The evaluators then developed and implemented a system-wide evaluation approach to measure performance in terms of: client profile; access and entry; service efficiency; client outcomes; and hospital demand.
Results. Data are collected electronically and more than 80% are derived from existing administrative datasets, minimising staff and client burden. Additional data include client outcomes and a health-related quality of life measure. The preliminary 12-month data suggest that clients have the equivalent of 'fair' or 'poor' self-reported health status (n = 862) and that average health utility scores are significantly (P < 0.05) worse than population control data. These analyses reveal, for the first time, that the Northern Alliance Hospital Admission Risk Program-Chronic Disease Management program is targeting appropriate clients.
Discussion. This methodology will enable many prospective assessments to be performed, including client outcome evaluation, service model comparisons, and cost-utility analyses.
Conclusion. This evaluation approach demonstrates the feasibility of a highly coordinated 'whole of system' evaluation. Such an approach may ultimately contribute to the development of evidence-based policy.
What is known about the topic? Program evaluation literature recommends establishing the objectives of a program, and the corresponding evaluation methodology early in the planning phase so that a thorough evaluation can commence with the implementation of the program.
What does this paper add? This paper provides an alternative evaluation methodology developed around the available administrative data, maximising the efficiency of data collection and analysis while placing minimal burden on clinicians. This pragmatic approach may be appropriate for large, ongoing programs with an existing administrative dataset and where funding for evaluation is limited.
What are the implications for practitioners? This paper has implications for both administrators and clinicians. The methodology is designed to facilitate evidence-based policy and planning at a regional and state level, and to assist with quality improvement at the local service level through ongoing performance monitoring and benchmarking.
In 2001, the Victorian Department of Human Services allocated funding to the Hospital Admission Risk Program (HARP) in an endeavour to arrest the growth in emergency department presentations. The HARP model targeted clients at risk of hospitalisation and provided a thorough assessment, planning, management and monitoring service delivered by a multidisciplinary team. Fourteen diverse projects, from five auspicing agencies, were funded in the Northern Region of Melbourne. Initially projects were functionally independent and relatively isolated, with different models of care, target groups, and management structures. Evaluation of the projects was inconsistent, and in many cases lacking altogether.
As HARP progressed, the Department of Human Services commissioned an evaluation, the outcomes of which were released in 2005. The main findings were that HARP was successful in reducing hospital demand and in building relationships across the care continuum. The Department of Human Services decided to mainstream HARP, with a commitment to ongoing funding. Consequently, HARP projects from the same health region were required to form a single program with unified governance and reporting. …