Content Validity of a Clinical Education Performance Tool: The Physical Therapist Manual for the Assessment of Clinical Skills

Background and Purpose: The content validity of evaluation tools currently used in student physical therapists' (PT) clinical education has not been reported previously. This study assessed the content validity of the Physical Therapist Manual for the Assessment of Clinical Skills (PT MACS). Subjects: Twenty-eight Academic Coordinators of Clinical Education were recruited from accredited professional PT education programs in the United States. Methods: A survey was developed to match PT MACS skills to criteria from The Guide and A Normative Model. The survey was mailed to these Academic Coordinators, who indicated their level of agreement on how the skills were matched to the criteria. Results: The Academic Coordinators strongly agreed or agreed that 50 of 53 skills matched the criteria from The Guide and/or A Normative Model. They indicated with a visual analog scale that the PT MACS describes student behaviors needed for success in a clinical education experience. Conclusion: The PT MACS can be considered to have good content validity in describing the behaviors needed for success in a clinical education experience. J Allied Health 2005; 34:24-30.

ALLIED HEALTH and nursing professional education programs use various tools to evaluate the performance of students during clinical education experiences. Each academic program chooses a specific evaluation tool to judge its students' behavior and performance.1-4 As a high-stakes measure, the evaluation of the clinical experience should involve data from multiple sources, including the student's self-assessment and the assessment of behaviors from the cognitive, affective, and psychomotor domains.4-6 Few studies have been published that examine the clinical education tools in current use by physical therapist professional education programs, and those studies have dealt with user-friendliness7 or with the development of a tool or reports on earlier versions of a current tool.8

The choice of an evaluation instrument can affect the outcome and interpretation of the evaluation of the student in the clinical setting in a positive or negative fashion.4-6 Summative evaluations in clinical education serve two primary purposes: first, to determine the readiness of the student to progress in the program or to graduate; and second, to determine entry-level competence in order to protect the consumer.9 One challenge facing allied health and nursing educators is the lack of systematic studies to determine the validity and reliability of assessment instruments.4-6 There is a need to establish a guideline for determining content validity of summative evaluation tools used in the clinical education component of the allied health sciences. Although the tool that this study examined is specific to physical therapy, the process of establishing content validity could be applied to many fields in the allied health sciences.
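A process like the one described here is often summarized quantitatively with a content validity index (CVI): the proportion of expert raters who agree or strongly agree that each item belongs on the instrument. The sketch below is illustrative only, not the authors' actual analysis; the rating scale, the agreement threshold, and all sample data are assumptions for the example.

```python
# Illustrative sketch (not the authors' analysis): a simple content
# validity index (CVI) computed from expert ratings of instrument items.
# Assumed 4-point scale: 1 = strongly disagree ... 4 = strongly agree.

def item_cvi(ratings):
    """Proportion of raters who agreed or strongly agreed (rating >= 3)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(items):
    """Average of item-level CVIs across all skills on the instrument."""
    return sum(item_cvi(r) for r in items) / len(items)

# Hypothetical ratings from 5 experts for 3 skills
skills = [
    [4, 4, 3, 4, 3],  # 5 of 5 agree -> item CVI 1.0
    [4, 2, 3, 4, 3],  # 4 of 5 agree -> item CVI 0.8
    [2, 2, 3, 1, 2],  # 1 of 5 agree -> item CVI 0.2
]
print([item_cvi(s) for s in skills])  # [1.0, 0.8, 0.2]
print(round(scale_cvi(skills), 3))    # 0.667
```

An academic program could apply the same idea to any summative clinical evaluation tool: items with a low CVI are candidates for revision or removal, which mirrors the study's finding that 50 of 53 PT MACS skills met the agreement criterion.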

Competent clinical practice, like other abstract constructs, can only be assessed by the observation of associated behaviors. Any instrument assessing competent clinical practice must include a balanced and representative sample of behaviors considered to be indicative of the profession as a whole.5 Using a tool with specific objectives that focuses on competent performance clarifies expectations and makes evaluation of clinical performance more objective.4,10 A clinical performance evaluation tool should have standardized tasks and instructions, pre-established identification of the critical aspects of the performance and the acceptable range of responses, and a standardized manner of scoring.1

Two of the most widely used tools in physical therapy (PT) clinical education have been the Blue MACS (Mastery and Assessment of Clinical Skills),11 developed by the Texas Consortium for Physical Therapy Clinical Education, Inc. (Texas Consortium) in 1979, and the Clinical Performance Instrument,12 developed by the American Physical Therapy Association in 1997. …
