Magazine article Training & Development Journal

Involving Managers in Training Evaluation

Do your training programs suffer from predictable-results syndrome? Do you find yourself perplexed over which training programs will best suit the needs of your diverse employees? Do you have a good grasp of the value of your training? Most training managers don't, and the predictable-results training blues seem to be catching. Consider the following based-on-life situation, and discover a prescriptive method for overcoming training blues.

No one-size-fits-all training


After sending three of his line supervisors to an interpersonal-skills training course, Ian Keen, a senior manager at a medium-size tool-and-die plant, felt frustrated. The results were just as he would have predicted.

Gail Wilson, his top supervisor, made good use of the learning experience. But Gail was the type who could be stranded on a deserted island and still learn something. It didn't make much difference what she learned in the training; she'd apply it to the job somehow.

Then there was Walt McFarland - an average supervisor whose initial response to training was generally positive. When Walt attended training, he would say, "It's useful," or "I got a lot out of it." But back on the job, nothing happened that showed he was doing anything differently. Walt's track record never changed - no matter how good or bad the training was.

Ian's third supervisor, Anita Rodriguez, was an enigma. Sometimes she took initiative and used what she learned in training; other times she seemed to do worse after training. If Ian showed an interest in what Anita was doing, she'd improve, but only for a while.

Ian knew of no sure way to tell how training would affect his supervisors. With or without training, Gail would figure out the task and get it done. Walt always did the usual, which was the same as what he'd done the day before. And Anita might do the job just fine, or she might ask Ian to share in the work. Ian never could tell.

Are you sure that the training your employees receive is valuable, or at least helpful? Do you have sufficient data or expertise to evaluate the training effectively? If Ian's problem sounds familiar, you could probably benefit from a simple, effective method that can help you gauge the real value of training. The secret? Involving managers in the process of selecting and evaluating training programs.

Training Impact


The Training Impact Assessment (TIA) is a process that requires managers to look collectively at what happens to their employees as a result of training. The thinking is that if most of the "Anitas" and even a few of the "Walts" use the training effectively, then the training has value. If the "Anitas" haven't changed or developed, then the training is probably ineffective. TIA helps eliminate managers' uncertainties about the effectiveness of training programs.

The TIA approach also results in managers supporting and committing to effective training. By communicating their support to higher management, they can influence executives who may not recognize effective training programs.

TIA has been used to evaluate programs for both technical and non-technical activities in Australia, Canada, South Africa, and the United States. In at least one case, the technique was used to evaluate a proposed marketing plan - with excellent results. In another case, a director of employee development who used TIA to assess a management-development program was personally rewarded with a bonus check for using the high-impact evaluation process.

Steps to success

The TIA method follows six steps.

1. Invite key clients to participate in the assessment sessions. In practice, the key client is often the trainee's boss. The boss is responsible for evaluating or gathering relevant data on the impact of the training on the job and on the unit or person. …
