Getting to Impact: Using the Evaluation Standard to Measure Results

By Saundra Rowell | Journal of Staff Development, Spring 2007

I serve on a local nonprofit foundation board that awards grants to community organizations that sponsor programs to discourage teen pregnancy. Over the years, the foundation has developed research-based rubrics for scoring grant applications. But recently, board members noted that agencies seeking funding don't describe the impact of their program on the youth they serve.

During a visit to one of the foundation's grant recipients, I was reminded of NSDC's standard on evaluation: Staff development that improves the learning of all students uses multiple sources of information to guide improvement and demonstrate its impact.

The agency I visited has had a program for nine years that focuses on helping high-risk students with personal development, health plans, healthy dating habits, and pregnancy prevention. Staff are genuinely engaged and concerned about young people, but the organization has no real assessment of how much difference it is making. The agency had recently implemented a pre- and post-test, but the results provided only anecdotal evidence of what was occurring as a result of the staff's efforts.

The organization's funding depends on whether it can demonstrate the need for and value of its program, because the foundation carefully scrutinizes and rates applicants to determine where to award grant money. The foundation now asks applicants, "What outcomes do you hope to achieve with this program? How are these outcomes measured?" The foundation wants specific and measurable outcomes, data measuring those stated outcomes, and evidence that the agencies have analyzed and improved their efforts based on data. The foundation now looks for strong evidence (e.g., specific numbers) that a program is successful and has achieved its proposed outcomes. In the current environment, schools are undergoing similar scrutiny amid calls for increased accountability. As professional developers, we can lead the march toward effective program evaluation, beginning with our own staff development programs.

Although professional developers are aware of NSDC's Standards for Staff Development (NSDC, 2001), many of us running professional development programs do not evaluate whether we are getting results: the impact of what teachers are learning on student learning.

Professional development leaders are capable of doing far more evaluation than we do. All programs can be evaluated, although perhaps not all at the same level of sophistication. Evaluation is a matter of intentionality. Just a few short years ago, many educators were frightened by the idea of examining student data. Central office evaluation staff usually explained student data to school staffs. Now, more educators are proficient in examining student data after learning through professional development what data means and how to analyze it. Our next emphasis should be on learning to create effective evaluations using the various data we collect.

Many who run school improvement programs, educational workshops, and professional development activities, just as in the case of community initiatives, use evaluation terminology but still rely largely on anecdotal evidence. Paper-and-pencil pre- and post-tests are an improvement over the absence of any evaluation, but these tests often fail to determine to what extent the skills learned were transferred into new practices that lead to improved achievement: the effect of the program on the ultimate goal of student learning.

In 2000, after a series of projects led by Joellen Killion, NSDC's director of special projects, NSDC concluded that leaders of most staff development programs fail to plan for and to evaluate the impact of programs on student learning. Killion immersed herself in evaluation theory and research, going beyond educational evaluation to examine cutting-edge evaluation work in the medical and community organization fields.

Killion (2002, p. 132) said almost anyone could begin to evaluate program effectiveness by developing "evaluation think," which she defined as "a frame of reference, a mindset, a set of analytical skills."
