American Journal of Pharmaceutical Education

Development of a Reliable, Valid Annual Skills Mastery Assessment Examination

INTRODUCTION

Progress examinations are considered viable tools for pharmacy program assessment. ACPE 2007 Standards Guideline 15.1 states that PharmD programs should "incorporate periodic, psychometrically sound, comprehensive, knowledge-based, and performance-based formative and summative assessments including nationally standardized assessments." (1) Although a locally constructed examination may not be as psychometrically robust as a commercially prepared standardized examination, it has 3 potentially significant advantages: it can be tailored to assess the specific terminal ability-based outcomes (TABOs) of the PharmD program; the college or school has complete access to and control of the data; and the assessment program can provide useful formative student feedback.

Key considerations for developing a valid progress examination include how well the examination covers important content; aligns that content with the curriculum; reflects what should be learned; measures what it purports to measure; reflects individual student scores in a meaningful way; allows cost-effective production; and delivers results in a timely fashion. (2) Additionally, the examination should be reliable and reasonably valid, as shown by content, thinking skills, internal, external, and consequential evidence.

The cognitive domain consists of all intellectual behaviors, including the 2 major categories of achievement and ability. Achievement refers to behavior that is easy to change, and includes 2 subcategories of knowledge and skills. Knowledge includes the facts, concepts, principles, and procedures that provide the core content of any curriculum. Skills are higher-order acts that require knowledge and involve performance in context. Ability refers to cognitive behavior that is more difficult to change. Ability is the long-term learning of a more complex behavior such as critical thinking or problem solving. (3) Ability develops from a foundation of knowledge and skills (Table 1).

Knowledge, skills, and abilities therefore exist on a continuum of increasing complexity. Performing an ability requires applying knowledge and skill, both of which can be learned and assessed within a short timeframe. The corresponding complex ability develops unpredictably and may not emerge until years later, potentially confounding schools' attempts to measure achievement of an ability within the timeframe available to educators.

The American Educational Research Association (AERA) states that every high-stakes educational examination program should meet several conditions including: examinees should be protected against high-stakes decisions based on a single test; examinations should be validated for each use; likely negative consequences should be explained prior to examination administration; curriculum and test content should be in alignment; validity of passing scores should be verified; opportunities for remediation should be provided; sufficient reliability for each intended use should be measured; and an ongoing evaluation of consequences of the examination should be conducted. (4)
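The reliability condition in particular lends itself to a concrete illustration. The sketch below is a minimal example, not drawn from the study, and the response data are fabricated: it estimates internal-consistency reliability for a dichotomously scored examination using the Kuder-Richardson Formula 20 (KR-20), a standard statistic for 0/1 item scores, computed in Python with NumPy.

    import numpy as np

    def kr20(scores):
        """Kuder-Richardson 20 reliability for a 0/1 examinee-by-item score matrix."""
        k = scores.shape[1]                         # number of items
        p = scores.mean(axis=0)                     # item difficulty: proportion correct
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of examinee total scores
        return (k / (k - 1)) * (1.0 - (p * (1.0 - p)).sum() / total_var)

    # Fabricated responses: 6 examinees x 5 items, 1 = correct, 0 = incorrect.
    responses = np.array([
        [1, 1, 1, 0, 1],
        [1, 0, 1, 1, 1],
        [0, 1, 0, 0, 1],
        [1, 1, 1, 1, 1],
        [0, 0, 1, 0, 0],
        [1, 1, 0, 1, 1],
    ])
    print(f"KR-20 = {kr20(responses):.2f}")  # about 0.67 for these fabricated data

Whether a given coefficient is "sufficient" depends, as the AERA conditions emphasize, on the intended use of the scores; a value this low would not support a high-stakes decision.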

Because examinations are composed of test items, proper item development is critical to ensure validity. Downing and Haladyna (5) described a quality assurance procedure that provides evidence of test validity through proper test item development. An ideal process documents how items are developed and how responses to the items are studied to confirm that the items are sound, and it yields both qualitative and quantitative forms of evidence. Table 2 summarizes the types of evidence required to make a reasonable claim of validity for an examination. The Outcomes Assessment Committee, composed of the assistant dean for assessment (chair) and 4 faculty members, followed this guide to design the Annual Skill Mastery Assessment (ASMA) examination. The examination was developed, printed, and scored using LXR Test Software (Logic eXtension Resources, Georgetown, SC). …
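To make the quantitative side of that evidence concrete, the following hypothetical sketch (assuming dichotomous scoring; it is not the committee's documented procedure or an LXR feature) computes two conventional item statistics: the difficulty index, the proportion of examinees answering correctly, and the corrected point-biserial discrimination, the correlation between an item score and the rest-of-test score excluding that item.

    import numpy as np

    def item_analysis(scores):
        """Yield (item number, difficulty, corrected point-biserial) for each
        column of a 0/1 examinee-by-item score matrix."""
        totals = scores.sum(axis=1)
        for j in range(scores.shape[1]):
            item = scores[:, j]
            rest = totals - item             # total score excluding this item
            difficulty = item.mean()
            # np.corrcoef returns the 2x2 correlation matrix; take the off-diagonal.
            discrimination = np.corrcoef(item, rest)[0, 1]
            yield j + 1, difficulty, discrimination

    # The same fabricated 6-examinee x 5-item response matrix as above.
    responses = np.array([
        [1, 1, 1, 0, 1],
        [1, 0, 1, 1, 1],
        [0, 1, 0, 0, 1],
        [1, 1, 1, 1, 1],
        [0, 0, 1, 0, 0],
        [1, 1, 0, 1, 1],
    ])
    for item, diff, disc in item_analysis(responses):
        print(f"Item {item}: difficulty = {diff:.2f}, discrimination = {disc:.2f}")

Items with very high or very low difficulty, or with near-zero or negative discrimination, would be flagged for qualitative review before reuse; pairing such numbers with reviewer judgment is precisely the mix of quantitative and qualitative evidence the procedure calls for.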
